Title: Looking at Data
2 Using data for program improvement: EIA
- Evidence
- Inference
- Action
3 Evidence
- Evidence refers to the numbers, such as
- 45% of children in category b
- The numbers are not debatable
4 Inference
- How do you interpret the numbers?
- What can you conclude from the numbers?
- Does evidence mean good news? Bad news? News we can't interpret?
- To reach an inference, sometimes we analyze data in other ways (ask for more evidence)
5 Inference
- Inference is debatable -- even reasonable people can reach different conclusions from the same set of numbers
- Stakeholder involvement can be helpful in making sense of the evidence
6 Action
- Given the inference from the numbers, what should be done?
- Recommendations or action steps
- Action can be debatable and often is
- Another role for stakeholders
7 What can we infer?
- Poll results A
- Candidate I.M. Good 51%, Candidate R.U. Kidding 49% (± 3%)
- Poll results B
- Candidate I.M. Good 56%, Candidate R.U. Kidding 44% (± 3%)
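The reasoning behind the two polls can be sketched in a few lines of code. This is an illustration only: the percentages and the ± 3-point margin come from the slide's hypothetical polls, and the `can_infer_leader` helper is an invented name.

```python
# Illustrative only: when does poll evidence support an inference about a leader?
# Percentages and the +/- 3-point margin of error are the slide's fake data.

def can_infer_leader(pct_a, pct_b, margin):
    """A simple rule of thumb: the gap between the candidates must exceed
    the combined uncertainty of the two estimates before the evidence
    supports a conclusion about who is ahead."""
    return abs(pct_a - pct_b) > 2 * margin

print(can_infer_leader(51, 49, 3))  # Poll A: 2-point gap, within the noise -> False
print(can_infer_leader(56, 44, 3))  # Poll B: 12-point gap, beyond the noise -> True
```

Same kind of evidence, different inferences: Poll A does not justify a conclusion, Poll B does.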
8 Program improvement: Where and how?
- At the state level: TA, policy
- At the regional or local level: supervision, guidance
- Classroom level -- spend more time on certain aspects of the curriculum
- Child level -- modify intervention
9 Key points
- Evidence refers to the numbers, and the numbers by themselves are meaningless
- Inference is attached by those who read (interpret) the numbers
- You have the opportunity and obligation to attach meaning
10 E I A Jeopardy
(Game board: three categories of 100-, 200-, and 300-point questions)
11 Use of Data
- Activity
- Evidence-Inference-Action
12 Continuous Program Improvement
- Plan (vision): program characteristics, child and family outcomes
- Implement
- Check (collect and analyze data)
- Reflect: Are we where we want to be?
13 Tweaking the System
- Reflect (Are we where we want to be?): Is there a problem? Is it working?
- Check (collect and analyze data): Why is it happening?
- Plan (vision: program characteristics, child and family outcomes): What should be done?
- Implement: Is it being done?
14 Continuous means...
15 Outcome questions for program improvement, e.g.
- Who has good outcomes?
- Do outcomes vary by
- Region of the state?
- Level of functioning at entry?
- Services received?
- Age at entry to service?
- Type of services received?
- Family outcomes?
- Education level of parent?
16 Examples of process questions
- Are ALL services high quality?
- Are ALL children and families receiving ALL the services they should in a timely manner?
- Are ALL families being supported in being involved in their child's program?
- What are the barriers to high quality services?
17 Working Assumptions
- There are some high quality services and programs being provided across the state.
- There are some children who are not getting the highest quality services.
- If we can find ways to improve those services/programs, these children will experience better outcomes.
18 Numbers as a tool
- Heard on the street
- Why are we reducing children to a number?
- So why do we need numbers?
23 Examining COSF data at one time point
- One group
- Frequency Distribution
- Tables
- Graphs
- Comparing Groups
- Graphs
- Averages
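The one-time-point analyses listed above can be sketched in a few lines. The class names and ratings below are made up; the only assumption carried over from the slides is that COSF ratings fall on a 1-7 scale.

```python
# A minimal sketch of a frequency distribution and group comparison
# for one time point, using invented COSF ratings (1-7 scale).
from collections import Counter
from statistics import mean

ratings = {
    "Class 1": [2, 3, 3, 4, 5, 5, 6, 6, 6, 7],   # fake data
    "Class 2": [1, 2, 2, 3, 3, 4, 4, 5, 5, 6],   # fake data
}

for cls, scores in ratings.items():
    freq = Counter(scores)                        # frequency distribution
    dist = {r: freq.get(r, 0) for r in range(1, 8)}  # table form, ratings 1-7
    print(cls, "distribution:", dist, "average:", round(mean(scores), 2))
```

The same counts can feed a table, a bar graph, or an average, which is exactly the menu of options the slide lists.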
24 Distribution of COSF Ratings in Fall
We are using fake data for illustration
25 Frequency on Outcome 1 - Fall
26 Frequency on Outcome 1 - Fall
27 Comparison of two classes - Fall
28 Frequency on Outcome 1 - Fall
29 Frequency on Outcome 1 - Class 1
30 Average Scores on Outcomes by Class, Fall 2008
31 Average Scores on Outcomes by Class, Fall 2008
32 Average Scores on Outcomes by Class, Fall 2008
33 Looking at change over time
- Extent of change on rating scale
- The OSEP categories
- Developmental trajectories
- Maintaining
- Changing
34 Extent of change on rating scale: Time 1 to Time 2
35 OSEP progress categories
- Looking at information across time
- Reducing the information to fewer categories to allow easier comparisons
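The idea of reducing two time points to fewer categories can be sketched as follows. Note that this is a simplified stand-in: the official OSEP progress categories also depend on comparisons with same-age peers, and the rating pairs below are fake data.

```python
# Illustration of reducing information to fewer categories: each child's
# (Time 1, Time 2) pair of COSF ratings collapses into one of three labels.
# NOTE: these are NOT the official OSEP progress categories, which also
# involve comparison to same-age peers; this is a simplified example.
from collections import Counter

def trajectory(t1, t2):
    if t2 > t1:
        return "improved"
    if t2 < t1:
        return "declined"
    return "maintained"

pairs = [(3, 5), (4, 4), (6, 5), (2, 4), (5, 5)]   # fake data
summary = Counter(trajectory(t1, t2) for t1, t2 in pairs)
print(summary)   # improved: 2, maintained: 2, declined: 1
```

Ten rating pairs per child become one label each, which makes comparisons across classes or regions much easier to read.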
40 Working with data
- Different levels of analysis are required for different levels of questions
- Aggregation will work for you but loses detail about individual children.
- 50 assessment items on 20 children in 5 classes in Fall and Spring
- 50 x 20 x 5 x 2 = 10,000 pieces of information
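The arithmetic above, and the trade-off it implies, fits in a few lines:

```python
# The slide's arithmetic: raw data volume vs. what aggregation leaves you.
items, children, classes, timepoints = 50, 20, 5, 2

raw = items * children * classes * timepoints
print(raw)                     # 10000 pieces of information

# Aggregating to one summary per class per time point reduces this to
# classes x timepoints numbers -- but the child-level detail is gone.
print(classes * timepoints)    # 10 numbers
```

This is the point of the slide: 10,000 raw values are unreadable, 10 summaries are readable, and the choice of level of aggregation should follow from the level of the question being asked.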
41 Using assessment data at the classroom level
- Looking at the data by child
- At a single point in time
- Over time
- Looking at data for areas that cut across children
- At a single point in time
- Over time
42 Example: Item Results for 5 Imaginary Children
Key: A = Accomplished, E = Emerging, NY = Not yet
43 Example: COSF Outcome Ratings for Class 3c, by Child
44 Example of an Aggregated Report for a Program: Percentage of Children Scoring 5 or Higher on COSF, by Class
What do you see in these data?
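A report like the one described above is a one-line aggregation per class. The class names and ratings here are invented for illustration:

```python
# Sketch of the aggregated report: percentage of children in each class
# with a COSF rating of 5 or higher. Classes and ratings are fake data.
ratings = {
    "Class 3a": [3, 4, 5, 5, 6, 7],
    "Class 3b": [2, 3, 3, 4, 4, 5],
    "Class 3c": [4, 5, 5, 6, 6, 6],
}

for cls, scores in ratings.items():
    pct = 100 * sum(s >= 5 for s in scores) / len(scores)
    print(f"{cls}: {pct:.0f}% scored 5 or higher")
```

Reading across the resulting rows is where inference begins: is a low class percentage bad news, a sign of a class serving children with greater needs, or a data-quality problem?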
45 Outcome questions for program improvement, e.g.
- Who has good outcomes?
- Do outcomes vary by
- Region of the state?
- Level of functioning at entry?
- Services received?
- Age at entry to service?
- Type of services received?
- Family outcomes?
- Education level of parent?
46 Looking at Data by Region
Percentage of Children Who Changed Developmental
Trajectories After One Year of Service
Possible inference?
47 Looking at Data by Age at Entry
Percentage of Children Who Changed Developmental
Trajectories After One Year of Service
Possible inference?
48 Take Home Message
- You will want to look at your data in lots of different ways
- You will want to think about the possible inferences
- You may need other information to decide among possible inferences
- Act on what you have learned
49 Tweaking the System
- Reflect (Are we where we want to be?): Is there a problem? Is it working?
- Check (collect and analyze data): Why is it happening?
- Plan (vision: program characteristics, child and family outcomes): What should be done?
- Implement: Is it being done?
50 How will/might these data be used?
- Federal level
- Overall funding decisions (accountability)
- Resource allocation (e.g., what kind of TA to fund?)
- Decisions about effectiveness of the program in individual states
- State level
- Program effectiveness??
- Program improvement??
- Local level
- Program improvement??
51 Need for good data
- Encompasses all three levels: federal, state, local
- Depends on how well local programs are implementing procedures
52 Many steps to ensuring quality data
53 Take Home Message
- If you conclude the data are not (yet) valid, they cannot be used for program effectiveness, program improvement, or anything else.
- Inference: Data not yet valid
- Action: Continue to improve data collection and quality assurance
54 Data Exploration
- Examine the data to look for inconsistencies
- If and when you find something strange, look for some other data you have that might help explain it. Is the variation caused by something other than bad data?
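The kind of inconsistency screening the slide describes can be sketched as simple rule checks. The record fields, the helper name, and the valid range (COSF ratings 1-7) are assumptions made for this example:

```python
# Sketch of routine consistency checks on child outcome records.
# Field names and the 1-7 valid range are assumptions for illustration.
records = [
    {"child_id": 1, "entry_rating": 3, "exit_rating": 5},
    {"child_id": 2, "entry_rating": 9, "exit_rating": 4},    # 9 is outside 1-7
    {"child_id": 3, "entry_rating": 4, "exit_rating": None}, # missing exit rating
]

def find_inconsistencies(recs):
    """Flag missing or out-of-range ratings for follow-up, not deletion:
    a strange value may reflect a real difference rather than bad data."""
    problems = []
    for r in recs:
        for field in ("entry_rating", "exit_rating"):
            v = r[field]
            if v is None:
                problems.append((r["child_id"], f"missing {field}"))
            elif not 1 <= v <= 7:
                problems.append((r["child_id"], f"{field} out of range: {v}"))
    return problems

for child, issue in find_inconsistencies(records):
    print(f"child {child}: {issue}")
```

Flagged records are a starting point for the slide's question: look for other data that might explain the oddity before concluding the data are bad.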
55 Obtaining good data
- Focus on addressing the threats to good data
- Local providers do not understand the procedures
- Local providers do not follow the procedures
- And others...
- Identify and address the threats
56 How far along is our state?
57 Keeping our eye on the prize
- High quality services for children and families that will lead to good outcomes.
58 For more information
- www.the-eco-center.org