TRANSITION: SCHOOL TO ADULT LIFE

Transcript and Presenter's Notes
1
TRANSITION: SCHOOL TO ADULT LIFE
  • Status
  • Winter/Spring 2008
  • Indiana Department of Education

2
290 COMMITTEE
  • Membership
  • Family Advocate
  • IN-SIG
  • IDOE
  • Div. of Mental Health
  • Vocational Rehabilitation
  • Bureau of Developmental Disabilities
  • Dept. of Corrections
  • Dept. of Workforce Development
  • Universities
  • Social Security Administration
  • IN-ARF
  • Post-Secondary Consultant

3
290 COMMITTEE
  • Employability Skills Rubric
  • Validity/Reliability study currently underway
  • 30 High Schools (5 students/2 adults 10)
  • On-line input to be completed by 3/14/08

4
290 COMMITTEE
  • Home and Community Based Medicaid Waiver
  • Total wait list: 17,500
  • Ages up to 22: 9,500 (54% of the wait list; a
    quick check of this figure follows below)
  • Total who have initiated services: 35
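A minimal sketch, in Python, of how the 54% share above is obtained; both counts are taken from this slide and nothing else is assumed:

```python
# Home and Community Based Medicaid Waiver wait-list counts from the slide above.
total_wait_list = 17_500
ages_up_to_22 = 9_500

share = 100 * ages_up_to_22 / total_wait_list
print(f"{share:.0f}%")  # prints 54%
```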
5
290 COMMITTEE
  • VOCATIONAL REHABILITATION SERVICES
  • TRANSITION GRANTS
  • SYSTEMATIZING RESOURCES

6
TRANSITION IEP
  • Electronic IEP
  • Subject Matter Expert Group (SME)
  • Compliant with IDEA-04 / Article 7
  • Follows the same process as the decision-making
    flow chart
  • Product: documentation of the process results

7
290 COMMITTEE
  • OCTOBER/NOVEMBER 2007
  • STATEWIDE TRAINING
  • Thank you
  • Stakeholders (Development of content)
  • IIDC (Logistics)
  • School Personnel who attended

8
(No Transcript)
9
STATEWIDE TRAINING Fall 2007
10
STATEWIDE TRAINING Fall 2007
Count of IIDC Web Downloads
  • Presentation: 3,687
  • Flowchart: 262
  • Transition Assessment definition: 304
  • Transition Assessment Guide: 3,831
  • Student Involvement: 227
  • Goal examples: 451
  • SOP: 217
  • HS vs. College: 745
  • Waiver: 229
  • Other states' SOP: 299
  • FAQ SOP: 109

11
TRAINING NEXT STEPS
  • All training resources on the IIDC website
  • http://www.iidc.indiana.edu/cclc/
  • Local personnel conducting training
  • Technical assistance available
  • Dr. Teresa Grossi
  • (812) 855-4070
  • tgrossi@indiana.edu

12
TRANSITION CONFERENCE
Paddling My Own Canoe
August 6-7, 2008
13

Critical Interrelationships (diagram): Staying in
School, Quality Transition IEPs, Positive post-school
outcomes, and Graduating
A Model for Collaborative Technical Assistance for SPP
Indicators 1, 2, 13, 14
14
Indicator Synergy

(Diagram: synergy among Indicators 1, 2, 13, and 14)
15
INDICATOR 13
  • 100% of IEPs for students with disabilities aged
    14 and above include coordinated, measurable,
    annual IEP goals and transition services that
    will reasonably enable the student to meet the
    post-secondary goals.

16
MONITORING (CI-13)
  • FFY 2006 (SY 06-07)
  • 17.90% compliant Transition IEPs
  • 5.1% of LEAs had 100% compliant Transition IEPs
  • Process
  • Local file reviews (Minimum 5 / Maximum 25)
  • Instrument based on NSTTAC, revised to match the
    Transition IEP flowchart.
  • Information sent to, then compiled by, DEL (a
    sketch of how the two rates above are computed
    follows below)
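A minimal illustration, in Python, of how the two rates above could fall out of the file-review process. The per-LEA counts below are hypothetical, pooling files across LEAs is an assumption about the calculation, and only the two reported FFY 2006 rates (17.90% and 5.1%) come from the slide:

```python
# Hypothetical LEAs: (compliant Transition IEPs, files reviewed).
# Actual local reviews covered a minimum of 5 and a maximum of 25 files per LEA.
lea_reviews = [(2, 10), (5, 5), (1, 12), (4, 25)]

compliant = sum(c for c, _ in lea_reviews)
reviewed = sum(n for _, n in lea_reviews)

# Statewide share of compliant Transition IEPs (reported: 17.90%).
iep_rate = 100 * compliant / reviewed

# Share of LEAs whose reviewed Transition IEPs were all compliant (reported: 5.1%).
lea_rate = 100 * sum(1 for c, n in lea_reviews if c == n) / len(lea_reviews)

print(f"Compliant IEPs: {iep_rate:.2f}%   LEAs at 100%: {lea_rate:.1f}%")
```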

17-24
MONITORING (CI-13) (data charts; no transcript)
25
MONITORING (PI-14)
  • Percent of youth who had IEPs, are no longer in
    secondary school, and who have been competitively
    employed, enrolled in some type of postsecondary
    school, or both, within one year of leaving high
    school.

26
MONITORING (PI-14)
  • FFY 2006 (SY 06-07)
  • 2,699 individuals completed the survey process.
    (These 2,699 individuals were FFY 2005 (SY 05-06)
    exiting students.)
  • Of these 2,699 individuals, 70.4% (1,901 of the
    2,699) were competitively employed, enrolled in
    some type of postsecondary school, or both; a
    quick check of this rate follows below.
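A quick check of the 70.4% outcome rate, sketched in Python; both counts come directly from this slide:

```python
# Indicator 14 survey results, FFY 2006 (SY 06-07).
respondents = 2_699   # FFY 2005 (SY 05-06) exiters who completed the survey
positive = 1_901      # competitively employed, in postsecondary school, or both

rate = 100 * positive / respondents
print(f"{rate:.1f}%")  # prints 70.4%
```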

27
MONITORING (PI-14)
28
MONITORING (PI-14)
  • March 2008: Completion trainings for the new data
    collection system
  • April 2008: Roll-out of AIR and sites begin
  • April-September: Interviewing (1-year follow-up)
  • September 8: Data due
  • Mid-October: Data delivery to LEAs

29
MONITORING (PI-1/PI-2)
  • Indicator 1: Percent of youth with individualized
    education programs (IEPs) graduating from high
    school with a regular diploma compared to the
    percent of all youth in the State graduating with
    a regular diploma.
  • Indicator 2: Percent of youth with individualized
    education programs (IEPs) dropping out of high
    school compared to the percent of all youth in the
    State dropping out of high school.
  • (A small sketch of how both comparisons are formed
    follows below.)
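A minimal sketch, in Python, of how the Indicator 1 and Indicator 2 comparisons are formed. Every count below is hypothetical and the simple cohort denominators are an assumption; only the two rate definitions come from the slide:

```python
# Hypothetical statewide counts for one school year.
iep_grads, iep_cohort = 4_200, 7_000      # youth with IEPs
all_grads, all_cohort = 52_000, 65_000    # all youth in the State
iep_dropouts, all_dropouts = 900, 4_500

# Indicator 1: regular-diploma graduation rate, IEP youth vs. all youth.
ind1_iep = 100 * iep_grads / iep_cohort
ind1_all = 100 * all_grads / all_cohort

# Indicator 2: dropout rate, IEP youth vs. all youth.
ind2_iep = 100 * iep_dropouts / iep_cohort
ind2_all = 100 * all_dropouts / all_cohort

print(f"Indicator 1: {ind1_iep:.1f}% (IEP) vs. {ind1_all:.1f}% (all youth)")
print(f"Indicator 2: {ind2_iep:.1f}% (IEP) vs. {ind2_all:.1f}% (all youth)")
```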
30-31
MONITORING (PI-1) Results (data charts; no transcript)
32-33
MONITORING (PI-2) Results (data charts; no transcript)
34
Problem Solving Method (cycle)
  • Defining the Problem: Is there a problem? What is
    it? How significant?
  • Analyzing the Problem: Why is it happening?
  • Determining What to Do
  • Implementing the Plan with Fidelity
  • Evaluating Progress: Did the plan work? What needs
    to happen next?
35
MONITORING
  • Bob Marra will discuss this in more detail on
    Friday morning
  • Multiple levels of monitoring
  • Data Collection
  • Data Verification
  • Data Analysis (Initial)
  • Data Analysis (Systemic Issues)

36
MONITORING
  • Data Analysis (Initial)
  • Synergy Indicator Clusters
  • Same 5 questions for all clusters
  • Tailored, thought-provoking questions for the 5
    questions
  • Stakeholders helped develop them
  • In draft stage

37
MONITORING
  • Q1: Describe the characteristics of your local
    indicator data collection (complete an
    analysis)
  • a. Who is responsible for designing data
    collection in your state and/or local school or
    district for each of the indicators?
  • b. What are the information sources and how is
    the information collected for each of the
    indicators?
  • c. Who is responsible for collecting the data?
  • d. Who is responsible for analyzing the data?
  • e. How good are the data in terms of reliability?
    Validity? Response rate?  

38
MONITORING
  • Example Q1 Tailored Questions
  • Is the data being collected consistently?
  • Are there too many people collecting the data?
    Are silos being created?
  • Who sees the data?
  • Who should be seeing the data?

39
MONITORING
  • Q2: As you reviewed your school's or district's
    data collection (sufficiency and quality/accuracy),
    do you need to look for more? Did questions about
    data collection emerge for which you want to seek
    answers? If so, list your questions as they
    pertain to this cluster of indicators.

40
MONITORING
  • Example Q2 Tailored Questions
  • Look at behavior data?
  • Does the data need to be disaggregated?
  • Gender?
  • Diploma?
  • Exceptionality?
  • Was the family involved in the Transition IEP
    development?

41
MONITORING
  • Q3: Describe your school's or district's
    performance on the cluster of indicators.
    Highlight areas that need improvement, which could
    include consideration of instruction/intervention,
    assessment/progress monitoring, data-based problem
    solving, LEA leadership, family involvement, and
    cultural responsivity.

42
MONITORING
  • Example Q3 Tailored Questions
  • Do we offer a curriculum that meets the needs of
    all students and encourages fewer drop-outs?
  • How soon after an issue is identified is the
    family involved in problem solving?
  • Do students with disabilities have access to the
    instruction/intervention that all students have?

43
MONITORING
  • Q4: As you reviewed your school's or district's
    performance (trends and patterns), what questions
    emerge about performance for which you want to
    seek answers? List your questions as they pertain
    to each cluster of indicators.

44
MONITORING
  • Example Q4 Tailored Questions
  • Did we meet the requirements for all
    exceptionalities?
  • Are we doing well in any cluster of categories?
  • Is it one building? One Exceptionality? One
    Program?

45
MONITORING
  • Q5: As you reviewed your school's or district's
    performance, describe the actions now necessary to
    address issues (instruction/intervention,
    assessment/progress monitoring, data-based problem
    solving, LEA leadership, family involvement, and
    cultural responsivity).

46
MONITORING
  • Example Q5 Tailored Questions
  • How is information shared? Is it effective?
  • Is training consistent?
  • What are the criteria for interventions?
  • What staff development has occurred?
  • How is progress being monitored?

47
MONITORING
  • Incorporate the following categories into your
    improvement activities:
  • Provide training/professional development
  • Improve data collection
  • Improve systems administration and monitoring
  • Improve collaboration/ coordination
  • Program development
  • Clarify/examine/develop policies and procedures
  • Provide technical assistance
  • Evaluation.

48
(No Transcript)