Transcript and Presenter's Notes

Title: NRS Data Monitoring for Program Improvement


1
NRS Data Monitoring for Program Improvement
  • Unlocking Your Data

2
Objectives: Day 1
  • Describe the importance of getting involved with
    and using data
  • Identify four models for setting performance
    standards, as well as the policy strategies,
    advantages, and disadvantages of each model
  • Determine when and how to adjust standards for
    local conditions
  • Set policy for rewards and sanctions for local
    programs
  • Identify programmatic and instructional elements
    underlying the measures of educational gain, NRS
    follow-up, enrollment, and retention.

3
Agenda: Day 1
  • Welcome, Introduction, Objectives, Agenda Review
  • The Power of Data
  • Why Get Engaged with Data? Exercise
  • The Data-driven Program Improvement Model
  • Setting Performance Standards
  • Adjusting Standards for Local Conditions
  • Establishing a Policy for Rewards and Sanctions
  • Getting Under the Data
  • Data Pyramids
  • Data Carousel
  • Evaluation and Wrap-up for Day 1

4
Objectives: Day 2
  • Distinguish between the uses of desk reviews and
    on-site monitoring of local programs
  • Identify steps for monitoring local programs
  • Identify and apply key elements of a change
    model and
  • Work with local programs to plan for and
    implement changes that will enhance program
    performance and quality.

5
Agenda: Day 2
  • Agenda Review
  • Planning for and Implementing Program Monitoring
  • Desk Reviews Versus On-site Reviews
  • Data Sources (small group work)
  • Steps and Guidelines for Monitoring Local
    Programs
  • Planning for and Implementing Program Improvement
  • A Model of the Program Improvement Process
  • State Action Planning
  • Closing and Evaluation

6
STOP! Why Get Engaged with Data?


7
Question for Consideration
  • Why is it important to be able to produce
    evidence of what your state (or local) adult
    education program achieves for its students?

8
The Motivation Continuum
  • Intrinsic to Extrinsic
  • Which is the more powerful force for change?

9
NRS Data-driven Program Improvement (Cyclical
Model)
  • STEPS
  • Set performance standards
  • Examine program elements underlying the data
  • Monitor program data, policy, and procedures
  • Plan and implement program improvement
  • Evaluate progress and revise, as necessary, and
    recycle

10
What's Under Your Data? The Powerful Ps
  • Performance (Data)
  • Program
  • Policies
  • Procedures
  • Processes
  • Products
11
NRS Data-driven Program Improvement Model
(Cyclical diagram, with NRS Data at the center)
  • Set Performance Standards
  • Examine Program Elements Underlying the Data
  • Monitor Program Data, Policy, and Procedures
  • Plan and Implement Program Improvement
  • Evaluate Improvement

12
Educational Gains for ESL Levels and Performance
Standards
Exhibit 1-2
13
Questions Raised by Exhibit 1-2
  • How were performance standards set? Based on past
    performance?
  • Are standards too low at the higher levels?
  • Is the performance pattern similar to that of
    previous years? If not, why not?
  • What are programs' assessment and placement
    procedures? Are the same assessments used for
    high and low ESL levels?
  • How do curriculum and instruction differ by
    level?
  • What are student retention patterns by level?

14
The Power of Data: Setting Performance Standards
15
Essential Elements of Accountability Systems
  • Goals
  • Measures
  • Performance Standards
  • Sanctions and Rewards

16
National Adult Education Goals
  • Reflected in NRS Outcome Measures of
  • educational gain,
  • GED credential attainment,
  • entry into postsecondary education, and
  • employment.

17
Performance Standards
  • Similar to a sales quota: how well are you
    going to perform this year?
  • Should be realistic and attainable, but
  • Should stretch you toward improvement
  • Set by each state in collaboration with ED
  • Each state's performance is a reflection of the
    aggregate performance of all the programs it
    funds

18
Standards-setting Models
  • Continuous Improvement
  • Relative Ranking
  • External Criteria
  • Return on Investment (ROI)

19
Continuous Improvement
  • Standard based on past performance
  • Designed to make all programs improve compared to
    themselves
  • Works well when there is stability and a history
    of performance on which to base the standard
  • Ceiling reached over time, resulting in little
    additional improvement

20
Relative Ranking
  • Standard is mean or median performance of all
    programs
  • Programs ranked relative to each other
  • Works for stable systems where median performance
    is acceptable
  • Improvement focuses mainly on low-performing
    programs
  • Little incentive for high-performing programs to
    improve

21
External Criteria
  • Set by formula or external policy
  • Promotes a policy goal to achieve a higher
    standard
  • Used when large-scale improvements are called
    for, over the long term
  • No consideration of past performance; may be
    unrealistic or unattainable

22
Return on Investment
  • Value of program ÷ Cost of program (a brief
    computational sketch follows this slide)
  • A business model that answers the question,
    "Are the services or program worth the investment?"
  • Can be a powerful tool for garnering funding
    (high ROI) or for losing funding (low ROI)
  • May ignore other benefits of program
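
The four models imply different calculations. As a rough illustration of that arithmetic only, not an NRS-prescribed method, the Python sketch below uses entirely hypothetical program names and figures to derive a standard under the continuous improvement, relative ranking, and external criteria models and to compute a simple ROI ratio.

    from statistics import median

    # Hypothetical percentage of students achieving educational gain, by program.
    past_year = {"Program A": 38.0, "Program B": 45.0, "Program C": 52.0}

    # Continuous improvement: each program's standard is its own past performance
    # plus a modest stretch increment (here, 2 percentage points).
    continuous = {name: rate + 2.0 for name, rate in past_year.items()}

    # Relative ranking: one statewide standard set at the median of all programs.
    relative = median(past_year.values())

    # External criteria: a target fixed by policy, regardless of past performance.
    external = 50.0

    # Return on investment: estimated value of outcomes divided by program cost.
    value, cost = 250_000, 100_000   # hypothetical dollar figures
    roi = value / cost               # 2.5 means $2.50 of value per $1 invested

    print(continuous)  # {'Program A': 40.0, 'Program B': 47.0, 'Program C': 54.0}
    print(relative)    # 45.0
    print(external)    # 50.0
    print(roi)         # 2.5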

23
Decision Time for State Teams
  • Which model(s) do you favor for setting standards
    for/with locals?
  • Is it appropriate to use one statewide model or
    different models for different programs?
  • How will you involve the locals in setting the
    standards they will be held to?

24
Question for Consideration
  • How do the standard-setting model(s) that
    states select represent a policy statement on the
    relationship between performance and quality that
    states want to instill in local programs?

25
Adjusting Standards for Local Conditions
  • Research suggests that standards often need to
    be adjusted for local conditions before locals
    can work to improve program quality.
  • WHY IS THIS SO?

26
Factors that May Require Adjustment of Standards
  • Student Characteristics
    • An especially challenging group
    • Students at lower end of level
    • Influx of different types of students
  • Local Program Elements
  • External Conditions

27
Shared Accountability
  • State and locals share responsibility to meet
    accountability requirements
  • State provides tools and environment for improved
    performance
  • Locals agree to work toward improving performance

28
Locals should know
  • The purpose of the performance standards
  • The policy and programmatic goals the standards
    are meant to accomplish
  • The standard-setting model that the state adopts
    and
  • That state guidance and support are available to
    locals in effecting change.

29
Shared Accountability
  • Which state-initiated efforts have been easy to
    implement at the local level?
  • Which have not?
  • What factors contributed to locals successfully
    and willingly embracing the effort?
  • What factors contributed to a failed effort?

30
Shared Accountability
31
What About Setting Rewards and Sanctions?
  • Which is the more powerful motivator: rewards or
    sanctions?
  • List all the different possible reward structures
    you can think of for local programs.
  • How might sanctioning be counter-productive?
  • List sanctioning methods that will not destroy
    locals' motivation to improve or adversely affect
    relationships with the state office.

32
Variations on a Theme Exercise
  • (Refer to H-10). Brainstorm as many possible
    rewards or incentives as you can for recognizing
    local programs that meet their performance
    standards.
  • Then brainstorm sanctions that the state might
    impose on local programs that do not meet their
    performance standards.
  • Select a recorder for your group to write one
    reward per Post-It Note and one sanction per
    Post-It Note.
  • When you have finished, wait for further
    instructions from the facilitator.

33
Summary of Local Performance Standard-setting
Process
Procedure | Goal
Select standard-setting model | Reflect state policies; promote program improvement
Set rewards and sanctions policy | Create incentives; avoid unintended effects
Make local adjustments | Ensure standards are fair and realistic for all programs
Provide technical assistance (T/A) | Create an atmosphere of shared accountability
Monitor often | Identify and avoid potential problems
34
Getting Under the Data
  • NRS data, as measured and reported by states,
    represent the product of underlying programmatic
    and instructional decisions and procedures.

35
Four Sets of Measures
  • Educational gain
  • NRS Follow-up Measures
    • Obtained a secondary credential
    • Entered and retained employment
    • Entered postsecondary education
  • Retention
  • Enrollment

36
Educational Gain
37
Follow-up Measures
38
Retention
39
Enrollment
  • Community Characteristics
  • Class Schedules and Locations
  • Recruitment
  • Instruction
  • Professional Development
40
Data Carousel


41
Question for Consideration
  • How might it benefit local programs if the
    State office were to initiate and maintain a
    regular monitoring schedule to compare local
    program performance against performance standards?

42
Regular Monitoring of Performance Compared with
Standards
  • Keeps locals focused on outcomes and processes
  • Highlights issues of importance
  • Increases staff involvement in the process
  • Helps refine data collection processes and
    products
  • Identifies areas for program improvement
  • Identifies promising practices
  • Yields information for decision-making
  • Enhances program accountability.

43
BUT
  • How can states possibly monitor performance of
    all local programs?
  • Don't we have enough to do already??
  • Where will we find staff to conduct the reviews?
  • You're kidding, right??

44
Not!


45
So... Let's Find Some Answers
  • How can you monitor performance of locals without
    overburdening state staff?
  • What successful models are already out there??
  • How does your state office currently ensure
    local compliance with state requirements?
  • Can you build on existing structures?

46
Approaches to Monitoring
  • Desk Reviews
    • Ongoing process
    • Useful for quantitative data (a minimal
      comparison sketch follows this slide)
    • Proposals
    • Performance measures
    • Program improvement plans
    • Staffing patterns
    • Budgets
  • On-site Reviews
    • Single event, lasting 1-3 days
    • Useful for qualitative data
    • Review of processes and program quality
    • Input from diverse stakeholders
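
On the desk-review side, the comparison of reported performance against standards is straightforward to automate. The Python sketch below, again with invented program names and figures, flags programs whose reported rate falls below their standard; a real review would draw on the state's NRS tables rather than hard-coded numbers.

    # Hypothetical standards and reported results (percent achieving educational gain).
    standards = {"Program A": 40.0, "Program B": 47.0, "Program C": 54.0}
    reported = {"Program A": 43.5, "Program B": 41.0, "Program C": 55.2}

    for program, standard in standards.items():
        actual = reported[program]
        status = "meets standard" if actual >= standard else "flag for follow-up"
        print(f"{program}: reported {actual:.1f} vs. standard {standard:.1f} -> {status}")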

47
Advantages and Disadvantages of Desk Reviews
Advantages | Disadvantages
Data, reports, proposals, etc., are already in the state office | Assumes accurate data that reflect reality
Review can be built into staff's regular workload | Local staff and stakeholders are not heard
Data are quantitative and can be compared to previous years | Static view of the data; no interaction in context
No travel time or costs required | No team perspective
48
Advantages and Disadvantages of On-site Reviews
Advantages | Disadvantages
Data are qualitative; review of processes and program quality | Stressful for the local program and the team
Input from the perspectives of diverse stakeholders | Arranging site visits and the team is time-intensive for both locals and the state
State works with locals to explore options for improvement and provides T/A | Requires time out of the office
Opportunity to recognize strengths, offer praise, and identify best practices | Incurs travel costs
49
Data Collection Strategies for Monitoring
  • Program Self-Reviews (PSRs)
  • Document Reviews
  • Observations
  • Interviews

50
Program Self-Reviews
  • Conducted by local program staff
  • Review indicators of program quality
  • Completed in advance of monitoring visit and can
    help focus the on-site review
  • Results can guide the program improvement process

51
Document Reviews
  • Can review from a distance:
    • Proposals
    • Qualitative and quantitative reports
    • Improvement plans
  • Can review on-site:
    • Student files
    • Attendance records
    • Entry and update records
    • Course evaluations

52
Qualitative and Quantitative Data 
53
Observations
  • Interactions:
    • During meetings
    • At intake and orientation
    • In hallways and on grounds
    • In the classroom
  • Link what is observed to:
    • Indicators of quality
    • Activities in the program plan
    • Professional development workshops

54
Interviews
  • Help clarify or explore ambiguous findings
  • Provide information regarding stakeholders'
    opinions, knowledge, and needs
  • Administrative, instructional, and support staff
  • Community partners
  • Community agencies (e.g., employment, social
    services)
  • Learners

55
Fill in the Boxes: Monitoring with Indicators of
Program Quality
  • In teams of 4-5 and using H-12, fill in the
    data sources you would expect to use, the
    questions you would ask locals, and the
    strategies you would use in conducting a desk
    review versus an on-site review.

56
Steps for Monitoring Local Programs
  1. Identify state policy for monitoring; gather
    support from stakeholders.
  2. Consider past practices when specifying scope of
    work for monitoring.
  3. Identify persons to lead and participate in
    monitoring.
  4. Identify resources available for monitoring
    locals.
  5. Determine the process for collecting data, with
    clearly defined criteria for rating; conduct
    monitoring.
  6. Report findings and recommendations.
  7. Follow up on results.

57
Data Help
  • Measure student progress
  • Measure program effectiveness
  • Assess instructional effectiveness
  • Guide curriculum development
  • Allocate resources wisely
  • Promote accountability
  • Report to funders and to the community
  • Meet state and federal reporting requirements
  • Show trends

58
BUT
  • Data do not help
  • If the data are not valid and reliable
  • If the appropriate questions are not asked after
    reviewing the data, or
  • If data analysis is not used for making wise
    decisions.

59
A Word about the Change Process
  • Factors that allow us to accept change
  • There is a compelling reason to do so
  • We have a sense of ownership of the change
  • Our leaders model that they are serious about
    supporting the change
  • We have a clear picture of what the change will
    look like and
  • We have organizational support for lasting
    systemic change.

60
Stages of Change
  1. Maintenance of the old system
  2. Awareness of new possibilities
  3. Exploration of those new possibilities
  4. Transition to some of those possibilities or
    changes
  5. Emergence of a new infrastructure
  6. Predominance of the new system

61
A Word of Caution
  • Start small; don't overwhelm locals with a data
    dump.
  • Begin with the core issues, such as educational
    gain.
  • Listen to what the data tell you about the big
    picture; don't get lost in too many details.
  • Work to create trust and build support by laying
    data on the table without fear of recrimination.
  • Provide training opportunities for staff on how
    to use data.
  • Be patient, working with what is possible in the
    local program.
  • Source: Brian Benzel, School Superintendent,
    Spokane, WA

62
Planning and Implementing Program Improvement
  • Stages of the Program Improvement Process
  • Planning
  • Implementing
  • Evaluating, and
  • Documenting Lessons Learned and Making
    Adjustments, as needed

63
Planning Questions
  • Who should be included on your program
    improvement team?
  • How will you prioritize areas needing
    improvement?
  • How will you identify and select strategies for
    effecting improvement?

64
Guiding Questions for Strategies
  • Is the strategy
  • Clear and understandable to all users?
  • One specific action or activity, or dependent on
    other activities? (If so, describe the sequence
    of actions.)
  • An activity that will lead to accomplishing the
    goal?
  • Observable and measurable?
  • Assignable to specific persons?
  • Based on best practices?
  • One that all team members endorse?
  • Doable, one that can be implemented?

65
Implementation Questions
  • Who will be responsible for taking the lead on
    ensuring that the change is implemented?
  • Who will be members of the change team and what
    will be their roles?
  • How will expectations for the change be promoted
    and nurtured?
  • How will the change be monitored?

66
Evaluation Questions
  • How will the changes that are implemented be
    evaluated?
  • How will the team ensure that both short- and
    long-term effects are measured?
  • Who will interpret the results?
  • Who will be on the lookout for unintended
    consequences?

67
Possible Evaluation Results
  • Significant improvement with no significant
    unintended consequences: stay the course.
  • Little or no improvement: stay the course, or
    scrap the changes?
  • A deterioration in outcomes: scrap the changes.

68
Documenting the Process
  • Document
  • what worked and what didn't,
  • lessons learned, and
  • logical next steps or changes to the plan.
  • Use as guide for future action.

69
State Planning Time
  • In your state teams, consider the questions
    on H-14 and begin planning.
  • Consider the stakeholders you want to include in
    your planning for data monitoring and program
    improvement.
  • Consider the problems you anticipate facing and
    propose solutions to those problems.
  • Complete H-14 to the best of your ability and be
    prepared to report on your plan in one hour.

70
Thank you
  • Great Audience!
  • Great Participation!
  • Great Ideas!
  • Live Long and Prosper!
  • Good Luck!!