Looking at Data

Provided by: fpg6 (learn more at https://fpg.unc.edu)

Transcript and Presenter's Notes

1
  • Looking at Data

2
Using data for program improvement: E-I-A
  • Evidence
  • Inference
  • Action

3
Evidence
  • Evidence refers to the numbers, such as
  • 45% of children in category b
  • The numbers are not debatable

4
Inference
  • How do you interpret the numbers?
  • What can you conclude from the numbers?
  • Does evidence mean good news? Bad news? News we
    can't interpret?
  • To reach an inference, sometimes we analyze data
    in other ways (ask for more evidence)

5
Inference
  • Inference is debatable -- even reasonable people
    can reach different conclusions from the same set
    of numbers
  • Stakeholder involvement can be helpful in making
    sense of the evidence

6
Action
  • Given the inference from the numbers, what should
    be done?
  • Recommendations or action steps
  • Action can be debatable and often is
  • Another role for stakeholders

7
What can we infer?
  • Poll results A
  • Candidate I.M. Good 51%, Candidate R.U. Kidding
    49% (± 3)
  • Poll results B
  • Candidate I.M. Good 56%, Candidate R.U. Kidding
    44% (± 3)

8
Program improvement: Where and how
  • At the state level -- TA, policy
  • At the regional or local level -- supervision,
    guidance
  • Classroom level -- spend more time on certain
    aspects of the curriculum
  • Child level -- modify intervention

9
Key points
  • Evidence refers to the numbers and the numbers by
    themselves are meaningless
  • Inference is attached by those who read
    (interpret) the numbers
  • You have the opportunity and obligation to attach
    meaning

10
E-I-A Jeopardy
(Game board: three categories, each with 100-, 200-, and 300-point questions)
11
Use of Data
  • Activity
  • Evidence-Inference-Action

12
Continuous Program Improvement
Plan (vision): program characteristics, child and
family outcomes
Implement
Check (collect and analyze data)
Reflect: Are we where we want to be?
13
Tweaking the System
Plan (vision): program characteristics, child and
family outcomes -- What should be done?
Implement -- Is it being done?
Check (collect and analyze data) -- Is it working?
Reflect: Are we where we want to be? -- Is there a
problem? Why is it happening?
14
Continuous means…
  • …the cycle never ends.

15
Outcome questions for program improvement, e.g.
  • Who has good outcomes?
  • Do outcomes vary by…
  • Region of the state?
  • Level of functioning at entry?
  • Services received?
  • Age at entry to service?
  • Type of services received?
  • Family outcomes?
  • Education level of parent?

16
Examples of process questions
  • Are ALL services high quality?
  • Are ALL children and families receiving ALL the
    services they should in a timely manner?
  • Are ALL families being supported in being
    involved in their child's program?
  • What are the barriers to high quality services?

17
Working Assumptions
  • There are some high quality services and programs
    being provided across the state.
  • There are some children who are not getting the
    highest quality services.
  • If we can find ways to improve those
    services/programs, these children will experience
    better outcomes.

18
Numbers as a tool
  • Heard on the street:
  • "Why are we reducing children to a number?"
  • So why do we need numbers?

23
Examining COSF data at one time point
  • One group - Frequency Distribution
  • Tables
  • Graphs
  • Comparing Groups
  • Graphs
  • Averages
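The first tool listed above, a one-group frequency distribution, can be sketched in a few lines. The ratings below are fake data, just as the slides use fake data for illustration; COSF ratings run from 1 to 7.

```python
# Sketch: frequency distribution of COSF ratings (1-7) for one group.
# The ratings list is hypothetical fake data, as in the slides.
from collections import Counter

fall_ratings = [3, 5, 4, 6, 5, 2, 5, 4, 7, 5, 3, 6]  # one imaginary class

freq = Counter(fall_ratings)
for rating in range(1, 8):
    count = freq.get(rating, 0)
    print(f"Rating {rating}: {count:2d}  {'#' * count}")  # crude text bar chart
```

The same `Counter` per class is all that is needed to build the class-comparison tables and graphs on the slides that follow.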

24
Distribution of COSF Ratings in Fall
We are using fake data for illustration
25
Frequency on Outcome 1 - Fall
26
Frequency on Outcome 1 - Fall
27
Comparison of two classes - Fall
28
Frequency on Outcome 1 - Fall
29
Frequency on Outcome 1 Class 1
30
Average Scores on Outcomes by Class, Fall 2008
31
Average Scores on Outcomes by Class, Fall 2008
32
Average Scores on Outcomes by Class, Fall 2008
33
Looking at change over time
  • Extent of change on rating scale
  • The OSEP categories
  • Developmental trajectories
  • Maintaining
  • Changing

34
Extent of change on rating scale: Time 1 to Time 2
35
OSEP progress categories
  • Looking at information across time
  • Reducing the information to fewer categories to
    allow easier comparisons
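The reduction described above can be sketched as follows. Note that the three-way split used here (improved / maintained / declined) is a simplified illustration of the idea, not the actual OSEP progress categories.

```python
# Sketch: reducing two time points of ratings to a few progress
# categories for easier comparison. The three categories below are a
# simplified stand-in, NOT the real OSEP category definitions.

def progress_category(entry_rating, exit_rating):
    """Collapse an (entry, exit) pair of ratings into one category."""
    if exit_rating > entry_rating:
        return "improved"
    if exit_rating < entry_rating:
        return "declined"
    return "maintained"

# Hypothetical (entry, exit) pairs for a few children
pairs = [(3, 5), (4, 4), (6, 5)]
print([progress_category(t1, t2) for t1, t2 in pairs])
```

Collapsing two full rating distributions into a handful of categories is what makes comparisons across classes, regions, or years tractable.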

40
Working with data
  • Different levels of analysis are required for
    different levels of questions
  • Aggregation will work for you but loses detail
    about individual children.
  • 50 assessment items on 20 children in 5 classes
    in Fall and Spring
  • 50 × 20 × 5 × 2 = 10,000 pieces of information
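The arithmetic on the slide, written out:

```python
# The multiplication from the slide: item-level data adds up quickly.
items, children, classes, time_points = 50, 20, 5, 2
total = items * children * classes * time_points
print(total)  # 10000 raw data points before any aggregation
```

This is why aggregation "works for you": a single class average summarizes 1,000 of those data points, at the cost of losing detail about individual children.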

41
Using assessment data at the classroom level
  • Looking at the data by child
  • At a single point in time
  • Over time
  • Looking at data for areas that cut across
    children
  • At a single point in time
  • Over time

42
Example Item Results for 5 Imaginary Children
A = Accomplished, E = Emerging, NY = Not yet
43
Example COSF Outcome Ratings for Class 3c by
Child
44
Example of an Aggregated Report for Program
Percentage of Children Scoring 5 or Higher on
COSF by Class
What do you see in these data?
45
Outcome questions for program improvement, e.g.
  • Who has good outcomes?
  • Do outcomes vary by…
  • Region of the state?
  • Level of functioning at entry?
  • Services received?
  • Age at entry to service?
  • Type of services received?
  • Family outcomes?
  • Education level of parent?

46
Looking at Data by Region
Percentage of Children Who Changed Developmental
Trajectories After One Year of Service
Possible inference?
47
Looking at Data by Age at Entry
Percentage of Children Who Changed Developmental
Trajectories After One Year of Service
Possible inference?
48
Take Home Message
  • You will want to look at your data in lots of
    different ways
  • You will want to think about the possible
    inferences
  • You may need other information to decide among
    possible inferences
  • Act on what you have learned

49
Tweaking the System
Plan (vision): program characteristics, child and
family outcomes -- What should be done?
Implement -- Is it being done?
Check (collect and analyze data) -- Is it working?
Reflect: Are we where we want to be? -- Is there a
problem? Why is it happening?
50
How will/might these data be used?
  • Federal level
  • Overall funding decisions (accountability)
  • Resource allocation (e.g., what kind of TA to
    fund?)
  • Decisions about effectiveness of program in
    individual states
  • State level
  • Program effectiveness??
  • Program improvement??
  • Local level
  • Program improvement??

51
Need for good data
  • Encompasses all three levels: federal, state,
    local
  • Depends on how well local programs are
    implementing procedures

52
Many steps to ensuring quality data
53
Take Home Message
  • If you conclude the data are not (yet) valid,
    they cannot be used for program effectiveness,
    program improvement, or anything else.
  • Inference: Data not yet valid
  • Action: Continue to improve data collection and
    quality assurance

54
Data Exploration
  • Examine the data to look
    for inconsistencies
  • If and when you find something strange, look for
    some other data you have that might help explain
    it. Is the variation caused by something other
    than bad data?

55
Obtaining good data
  • Focus on addressing the threats to good data
  • Local providers do not understand the procedures
  • Local providers do not follow the procedures
  • And others…
  • Identify and address the threats

56
How far along is our state?
57
  • Keeping our eye on the prize
  • High quality services for children and families
    that will lead to good outcomes.

58
  • For more information…
  • www.the-eco-center.org