Transcript and Presenter's Notes

Title: Data Analysis for Program Improvement

1
Data Analysis for Program Improvement
  • Gay Roby and Teri Clark
  • BTSA Directors Meeting
  • May 2003

2
Rule 1 of Data Analysis and Interpretation
  • Your conclusions will be no better than your
    data.
  • Garbage In, Garbage Out.

3
Desirable Data Characteristics
  • Reliability
  • Validity

4
Reliability
  • Reliability refers to the consistency of results.
  • Slightly modified from Guskey (2000), p. 236
  • Reliability means that you should get the same results if you were to collect
  • similar data at the same time with a different method
  • the same data by another person
  • the same data at a later time

5
Threats to Reliability of Conclusions
  • Vague questions
  • Lack of connection to experience
  • Interviewing with a pre-conceived outcome in mind
  • Too small or too biased a sample

6
Validity
  • Validity refers not to data instruments themselves, but to the appropriateness and adequacy of interpretations made from the instruments.
  • Validity is always specific to a particular interpretation or use. No data collection instrument is valid for all purposes.
  • Validity is a matter of degree. It does not exist on an all-or-none basis.
  • Modified from Guskey (2000), pp. 235-236

7
Threats to Validity of Conclusions
  • Too narrow a focus
  • Too broad a focus
  • Results are actually from an unmeasured variable

8
Statistical Measures
  • Mean: the average of the responses
  • Standard Deviation (SD): the spread of responses around the mean

9
Two Patterns of Responses
  Pattern 1        Pattern 2
  1 1 1 1 1        2 2 3 3 3
  1 1 1 1 1        3 3 3 3 3
  1 1 1 1 1        3 3 3 3 3
  5 5 5 5 5        3 3 3 3 3
  5 5 5 5 5        3 3 3 3 3
  5 5 5 5 5        3 3 3 4 4
  Mean = 3         Mean = 3

What is the difference between the two patterns?
Would you care which pattern produced your
mean scores? Why?
10
Two Patterns of Responses
  Pattern 1        Pattern 2
  1 1 1 1 1        2 2 3 3 3
  1 1 1 1 1        3 3 3 3 3
  1 1 1 1 1        3 3 3 3 3
  5 5 5 5 5        3 3 3 3 3
  5 5 5 5 5        3 3 3 3 3
  5 5 5 5 5        3 3 3 4 4
  Mean = 3         Mean = 3
  SD = 2           SD ≈ 0.37

For roughly normal data, about 2/3 of the responses fall within 1 SD of the
mean, and over 90% fall within 2 SDs.
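
To make the contrast concrete, the following is a minimal sketch (not part of the original presentation) that recomputes the mean and SD of the two patterns above using only Python's standard library. It shows how identical means can hide very different spreads.

```python
# Minimal sketch: recompute mean and SD for the two response patterns above.
# pstdev is the population standard deviation.
from statistics import mean, pstdev

pattern_1 = [1] * 15 + [5] * 15           # all responses at the extremes
pattern_2 = [2] * 2 + [3] * 26 + [4] * 2  # responses clustered around 3

for name, scores in [("Pattern 1", pattern_1), ("Pattern 2", pattern_2)]:
    m, sd = mean(scores), pstdev(scores)
    within_1sd = sum(abs(x - m) <= sd for x in scores) / len(scores)
    print(f"{name}: mean = {m:.2f}, SD = {sd:.2f}, within 1 SD = {within_1sd:.0%}")

# Output:
# Pattern 1: mean = 3.00, SD = 2.00, within 1 SD = 100%
# Pattern 2: mean = 3.00, SD = 0.37, within 1 SD = 87%
```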
11
Analyzing Data
  • Sample response format

    Not at All                        Fully
        1          2          3          4

  • What message do you intend to send by responding at each score point?

12
Question
  • How satisfied are you with the program's use of evaluation data?
  • Means:

    Group    Program    State
    SPs      2.84       3.09
    SAs      3.23       3.02
    Staff    3.58       3.44

  • What conclusions do you draw from the comparisons of the program and state means?

13
Meaning of Means
  • What are the implications for you and your program for each of the following?
  • Note: four-point scale (BT = beginning teacher, SP = support provider)

    Satisfaction with                Program Mean    State Mean
    support for . . .                BT      SP      BT      SP
    working with English learners    3.5     3.7     2.2     2.3
    classroom management             3.7     3.9     3.5     3.6
    using computer technology        1.7     1.9     1.5     1.7
    assessing student learning       1.7     1.3     3.2     3.5
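
As one way to operationalize this comparison, here is a small sketch (illustrative only; the 0.5 cutoff is an assumption, not something the presentation specifies) that flags survey items where the program mean differs notably from the state mean.

```python
# Illustrative sketch: flag items where program and state means diverge.
# The 0.5 cutoff on a four-point scale is an assumed threshold.
items = {
    # item: ((program BT, program SP), (state BT, state SP))
    "working with English learners": ((3.5, 3.7), (2.2, 2.3)),
    "classroom management":          ((3.7, 3.9), (3.5, 3.6)),
    "using computer technology":     ((1.7, 1.9), (1.5, 1.7)),
    "assessing student learning":    ((1.7, 1.3), (3.2, 3.5)),
}
THRESHOLD = 0.5

for item, ((prog_bt, prog_sp), (state_bt, state_sp)) in items.items():
    gap_bt, gap_sp = prog_bt - state_bt, prog_sp - state_sp
    if max(abs(gap_bt), abs(gap_sp)) >= THRESHOLD:
        print(f"{item}: BT gap {gap_bt:+.1f}, SP gap {gap_sp:+.1f}")

# Output:
# working with English learners: BT gap +1.3, SP gap +1.4
# assessing student learning: BT gap -1.5, SP gap -2.2
```

In this hypothetical run the flagged items are exactly the ones discussion would center on: an apparent strength relative to the state and a clear weakness.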

14
Differences between Data/Evidence, Analysis, and
Actions
  • Data/evidence: raw findings, e.g., mean, SD, comparison of subgroup means, summaries of interviews with an indication of the extent of frequency/agreement
  • Interpretation/analysis: speculations/hunches about the meaning of the data
  • Actions: changes to be made in the program as a result of the analysis

15
What Are They Trying to Tell Us? "CFASST has too much paperwork."
  • Interpretation
  • Not enough time
  • Not meaningful
  • Not as important as other uses of time
  • Action
  • Reduce amount of work
  • Increase frequency or length of SP/BT meetings
  • Improve questions
  • Be explicit about intent with SPs
  • Model how CFASST is used to improve practice
  • Work on BT prioritization of time

16
Data Analysis and Action
  • Data/Evidence Most beginning teachers
    participating in focus group interviews do not
    believe that professional development is
    necessarily geared toward their needs.
  • Interpretation/Analysis
  • Action

17
Data Analysis
  • Data analysis tends to be like Matryoshka dolls: you think about a result and wish for more data to help you understand it.
  • Triangulation of data from different types of
    evidence and knowledgeable groups is best.
  • Triangulation of data is also time-consuming
    and/or expensive, so it needs to be used
    strategically.

18
Local Evaluations
  • How do you design them?
  • How do you use them?
  • When do you provide them?

19
What does the Standard 4 Evaluation say?
  • Element a: Local program goals and the induction program standards are the criteria for program evaluation
  • Element b: . . . include info from multiple internal and external sources
  • Element c: . . . regularly collects feedback about program quality and effectiveness from all participants

20
1. Use Backwards Planning
  • What do you want to know?
  • When do you want to know it?
  • From whom do you need to collect information?

21
For example
  • Standard 8, Element d: . . . considers input from the participating teacher in pairing the support provider with the participating teacher
  • Who: participating teachers
  • When: about 6-8 weeks into their program and at the end of the first year
  • Why: to alter matches that are not working

22
2. Put Questions in a Layout Grid
23
3. Identify the standard they address with a
superscript
  • Would you like to remain with the same support provider next year? (8d)

24
4. Go through all the standards for all
stakeholders
25
5. Decide when to survey whom
  • Determine how often you want to survey: semiannually, quarterly, bimonthly, or monthly
  • Group questions accordingly
  • Color code your survey paper
  • Include a return date
  • Make it as easy as possible for them to return
    the survey to you
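
Pulling steps 2 through 5 together, here is a hypothetical sketch (the presentation describes a paper layout grid, not code; the field names and sample entries are illustrative assumptions) of a question bank in which each question carries its standard/element superscript, its stakeholder group, and the survey wave that asks it.

```python
# Hypothetical sketch of the layout grid from steps 2-5: every question is
# tagged with its standard/element "superscript", its audience, and the
# survey wave that asks it, so each survey can be assembled per wave.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Question:
    text: str
    element: str   # standard/element superscript, e.g. "8d"
    audience: str  # stakeholder group
    wave: str      # when the question is asked

bank = [
    Question("Would you like to remain with the same support provider "
             "next year?", "8d", "participating teachers", "weeks 6-8"),
    Question("How satisfied are you with the program's use of evaluation "
             "data?", "4c", "support providers", "end of year"),
]

# Group by (wave, audience) so each color-coded survey can be printed
# with its questions and their standard superscripts.
grid = defaultdict(list)
for q in bank:
    grid[(q.wave, q.audience)].append(f"{q.text} [{q.element}]")

for (wave, audience), questions in sorted(grid.items()):
    print(f"{wave} survey for {audience}:")
    for line in questions:
        print(f"  - {line}")
```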

26
6. Stop and analyze results
  • At the end of your school year cycle, take
    the time to critically analyze your questions and
    results for validity and reliability.
  • Did the folks understand the question? Did you
    get what you asked for?
  • Did your questions provide you with results that
    you could use in program implementation?
  • Do you need different information and therefore,
    need to rewrite this question or ask a different
    question?
  • Are your results consistent with last year's results? If not, why not?

27
Let's practice!
  • Use the Standards booklets provided
  • Each table will design questions for a specific stakeholder group
    - identify when the question needs to be asked
    - identify, with a superscript, from which standard and element the question has been taken
  • Each table will present their questions to the
    group (time permitting).
  • Results will be compiled and sent out to
    participants of this session/cluster leadership

28
Site Administrator Questions