State Career and Technical Education Self-Assessment
1
State Career and Technical Education
Self-Assessment
  • Tom Kelsh, Senior Research Associate, MAGI
    Educational Services, Inc.
  • Karen Batchelor, State Director for Career and
    Technology Education, Texas Education Agency
  • Dan Covington, Director for Fiscal and Information
    Management, Tennessee Department of Education
  • Bernie McInerney, Tech Prep Coordinator, New York
    State Education Department
  • Kathy Shibley, State Director of Career-Technical
    and Adult Education, Ohio Department of Education

2
STATE CAREER and TECHNICAL EDUCATION (CTE)
SELF-ASSESSMENT  
Assessing the Progress and Future Planning of
CTE: A Self-Assessment Tool for State Agencies
Created by
3
Purpose
The State Career and Technical Education (CTE)
Self-Assessment is a comprehensive, voluntary
instrument designed to help guide states' program
improvement efforts.
4
Purpose (continued)
The instrument identifies many activities, tasks,
processes, and collaborations that, if they occur
consistently, ensure that CTE programs are being
implemented with a high degree of quality.
5
Purpose (continued)
By using this tool in a dynamic, ongoing way,
states can identify the many existing CTE
practices and policies that constitute quality
and use them as building blocks for system-wide
continuous improvement, from properly
administering their basic grants and tech-prep
programs to using their accountability data to
fund local programs.
6
Purpose (continued)
The process of the self-assessment also provides
intangible value beyond any written reports or
assessments because it
  • builds commitment and ownership on the part of
    the state-level staff who participate in the
    process;
  • promotes team building and consensus among state
    CTE leaders;
  • increases the capacity for strategic thinking in
    the field of CTE; and
  • builds an understanding of what the federal
    government requires of states with respect to
    quality performance.

7
Purpose (continued)
And finally, state CTE teams who engage in CTE
self-assessment prior to the OVAE monitoring
visit will be better prepared to take full
advantage of the exchange of ideas and technical
assistance provided. They will have considered
the views of key stakeholders, assembled and
digested information on the different components
of CTE, and come to a consensus on the current
status of their statewide efforts.
8
Directions for Use
  The CTE Self-Assessment asks state agencies to
rate their CTE programs according to 30 quality
indicators. The ratings should take into account
the various pieces of evidence that define each
indicator. A five-point rating scale, representing
a continuum of implementation progress, has been
developed and is described below.
9
Recommended Steps
The following steps are recommended to conduct
the state CTE self-assessment.
1.  Identify and recruit the key CTE stakeholders
to complete the self-assessment. A variety of
approaches to conducting this step can prove
effective. Regardless of the approach used,
however, it is important to enlist input from key
stakeholder groups.
10
Recommended Steps (continued)
2. Gather supporting evidence and data. The
instrument should be completed by knowledgeable
stakeholders who use as much supporting evidence
as possible. Sources of information can include
the state plan, reports, minutes of meetings,
mission/vision statements, policies, written
documentation and data gathered through
interviews with stakeholders, student records,
program site visits, third-party evaluation
evidence, financial records, proposals, local
applications, monitoring tools, the state's
professional development plan, progress reports,
and so forth.
11
Recommended Steps (continued)
2. Gather supporting evidence and data
(continued). Examples of information sources
for each indicator can be found in the document,
State CTE Self-Assessment Sample Data Sources,
beginning on page 40.
12
Recommended Steps (continued)
 3. Complete the self-assessment. Carefully
read the evidence for each indicator. If the
evidence is in place (i.e., implemented), place a
checkmark (✓) in the box provided. If you feel
that your level of implementation is systematic,
without significant weaknesses or gaps, place an
asterisk (*) next to the checkbox. Then review
these individual assessments and decide on a
final rating for the indicator; fill in the
appropriate circle on the five-point scale. Use
the Notes section to record any explanatory or
expanded information about the state's
performance for that indicator. Once you have
rated all of the indicators in each major CTE
area, transfer your ratings to the Summary Form,
beginning on page 36.
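The tallying described in Step 3 can be pictured as a simple decision rule. The sketch below is a hypothetical interpretation, using the rating labels that appear later in this deck (No, Minimal, Moderate, Complete, Exemplary); the thresholds are illustrative assumptions, and the instrument itself leaves the final rating to reviewer judgment.

```python
# Hypothetical tally for Step 3 of the self-assessment.
# Each piece of evidence is an (implemented, systematic) pair:
# a checkmark means implemented; an asterisk means systematic.
# The 1-5 thresholds below are illustrative assumptions, not the
# official instrument's rule.

def rate_indicator(evidence):
    """Return a 1-5 rating for one quality indicator."""
    total = len(evidence)
    implemented = sum(1 for imp, _ in evidence if imp)
    systematic = sum(1 for imp, sys in evidence if imp and sys)
    if total == 0 or implemented == 0:
        return 1                      # no implementation
    if implemented / total < 0.5:
        return 2                      # minimal implementation
    if implemented < total:
        return 3                      # moderate implementation
    if systematic < total:
        return 4                      # complete, but not fully systematic
    return 5                          # exemplary: every item systematic

LABELS = {1: "No", 2: "Minimal", 3: "Moderate",
          4: "Complete", 5: "Exemplary"}
```

A reviewer would then transfer `LABELS[rate_indicator(...)]` for each of the 30 indicators to the Summary Form.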
13
Recommended Steps (continued)
4. Provide feedback to CTE stakeholders
involved in Step 1. Throughout the process of
completing the CTE self-assessment, information
should be fed back to the key CTE stakeholders as
part of this dynamic process of inquiry and
reflection.  
14
Optional Uses
The primary use of the CTE self-assessment is to
help guide states' program improvement efforts
through careful study of statewide policies,
procedures, and activities. However, a number of
states have found it helpful to use the tool in
other, creative ways:
  • as a monitoring tool for reviewing local grantee
    programs, functions, and expenditures;
  • as an instructional device for orienting new
    staff (or re-acquainting veterans) with what
    constitutes quality in the delivery of CTE
    programs; and
  • as a way of communicating the importance of
    Perkins/CTE to non-CTE state-level
    stakeholders.

15
1. State Administration
Quality Indicator 1.1 Mission
NOTES (evidence of accomplishments, related
data/criteria, key stakeholders involved,
critical issues, Web site, etc.)
16
Final Rating Summary Form
17
STATE CTE SELF-ASSESSMENT SAMPLE DATA SOURCES
18
NCCTE Webcast Panel: State CTE Self-Assessment
New York State's Perspective
Bernie McInerney, Tech Prep Coordinator, New York
State Education Department
  • Ohio State University, Columbus, OH, March 27, 2006

19
NYS CTE Perspective
  • US Department of Education
  • Office of Vocational and Adult Education (OVAE)
    document decision:
  • Compliance: Perkins Monitoring Checksheets
  • Quality: CTE Tech Prep Self-Assessment Tool

20
NYS Perkins Team Stages
  • Participation in Self-Assessment Tool conference
    calls
  • Review of Perkins monitoring checklists
  • Decision: Monitoring Checksheets or
    Self-Assessment Tool
  • Build NYS Perkins Review website on Monitoring
    Checksheets
  • Self-Assessment Tool as a complement for reference

21
NYS Perkins Team Stages (continued: Post-OVAE
Monitoring Visit)
  • A user-friendly Pilot Survey developed using
    the Self-Assessment Tool - beyond the boxes
  • Target audiences for the Pilot Survey:
  • key State Perkins Team staff,
  • Tech Prep Consortia, and
  • a smattering of CTE Directors
  • Survey results for outcome possibilities

22
Perkins Compliance Checksheets
23
(No Transcript)
24
(No Transcript)
25
Compliance Outcome: USDE Monitoring Report
  • Perkins Monitoring Review (July 2005) went very
    well, with only a few compliance issues, which
    were quickly resolved

26
Quality Initiative: Post-USDE Monitoring Report
  • NYS CTE Tech Prep Pilot Online Survey with
    Self-Assessment Tool

27
Survey Form (scroll to bottom!)
28
Designed so individual areas can be chosen
29
Notes Sections are provided for feedback
30
NYS CTE Self-Assessment: Process and Outcomes
  • Strategies being surveyed in NYS:
  • use in lieu of Perkins Monitoring Checklist
  • tool for reviewing local grantee programs
  • incorporate into our State/Local Plans or final
    reports, e.g. the narrative for the annual
    Perkins Consolidated Annual Report (CAR) to OVAE
  • orientation instrument for new staff
  • in-service instrument for experienced staff

31
CTE Self-Assessment Tool Process and
Outcomes (continued)
  • use to complement improvement planning and
    implementation with regional accreditation
    organizations in postsecondary institutions
  • modified tool for local grantee programs' use,
    and
  • learning other strategies from NCCTE Webcast
    panelists from Ohio, Tennessee, and Texas!

32
Thank you
  • New York State Education Department
    Bernie McInerney, Tech Prep Coordinator
    bmcinern@mail.nysed.gov  518-474-4157
  • Pilot Online Survey Form can be found at

http://www.emsc.nysed.gov/workforce/techprep/tech.html
33
Ohio's Experience with State CTE Self-Assessment
Kathy Shibley, Ph.D., Director, Office of
Career-Technical and Adult Education, Ohio
Department of Education, March 27, 2006
34
Relationship to Monitoring
  • Timing
  • Sequencing with monitoring visit
  • Volume of time required
  • Alignment with monitoring checklist
  • Benefits of definition

35
Future Uses
  • State Plan
  • Monitoring implementation
  • Mid-monitoring check

36
State CTE Self-Assessments
Our Children Are Our Future: No Child Left Behind
  • Karen Batchelor
  • Texas Education Agency

37
Performance-based Monitoring (PBM) System
  • Performance Based
  • Data Driven
  • 2004-2005 CTE Pilot Year
  • PBM district reports for CTE concentrators during
    03-04
  • Intervention stages - based on the number of
    indicators below state standards (1-2-3-4)

38
Texas Assessment of Knowledge and Skills (TAKS)
  • Student academic performance in
  • Math
  • Reading/ELA
  • Science
  • Social Studies

39
PBM Indicators for CTE Concentrators
  • CTE overall performance on TAKS
  • CTE SPED TAKS
  • CTE LEP TAKS
  • CTE ED TAKS
  • CTE Tech-Prep TAKS
  • CTE Annual Dropout Rate
  • Total of 21 CTE indicators

40
CTE Report Only Measures
  • RHSP/DAP Graduation Rate: CTE students earning a
    recommended or distinguished achievement diploma
  • CTE Non-traditional course completion (males)
  • CTE Non-traditional course completion (females)

41
PBM Standards
Performance Level 1 → 1-5% below standard
Performance Level 2 → 5.1-10% below standard
Performance Level 3 → 10.1% or more below standard
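The cutoffs above amount to a simple banding rule. The sketch below is an assumption about how an indicator's shortfall (in percentage points below the state standard) maps to a PBM performance level; TEA's actual rules may differ in edge cases.

```python
# Sketch of the PBM performance-level cutoffs listed above.
# Input is how many percentage points an indicator falls below the
# state standard; the sub-1-point "at/above standard" band is an
# illustrative assumption.

def performance_level(points_below_standard):
    """Map a shortfall below the state standard to a PBM level
    (None = at or above standard)."""
    if points_below_standard < 1:
        return None          # at or above standard: not flagged
    if points_below_standard <= 5:
        return 1             # 1-5% below standard
    if points_below_standard <= 10:
        return 2             # 5.1-10% below standard
    return 3                 # 10.1% or more below standard
```

Per the earlier slide, the count of indicators flagged this way then drives the district's intervention stage (1-2-3-4).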
42
CTE PBM Summary
  • 2004-05: Stage 1 → 67 districts
  • Stage 2 → 23 districts
  • Stage 3 → 13 districts
  • Stage 3 → 3 districts
  • Stage 4 → 21 districts
  • 24 on-site visits
  • 2005-06: Stage 1 → 158 districts
  • Stage 2 → 30 districts
  • Stage 3 → 20 districts
  • Stage 4 → 26 districts
  • 26 on-site visits

43
District PBM Review Team
  • CTE district/campus administrator
  • Parent of CTE student
  • CTE teacher
  • CTE student
  • Guidance counselor
  • Business/industry partner
  • Other team members as desired

44
Intervention Stages
  • Stage 1
  • Program Review and Improvement Plan
  • Stage 2
  • Focused Data Analysis
  • Program Effectiveness Review/self study
  • Continuous Improvement Plan
  • Stage 3
  • Full Compliance Review
  • Stage 4
  • Full Compliance Review
  • CTE/ Civil Rights On-site Review

45
Program Effectiveness Review (based on Perkins
State Self-Study)
  • Administrative Leadership
  • Local Perkins Application/Plan
  • Tech-Prep/Advanced Technical Credit
  • Special Populations
  • Civil Rights (CR)
  • Fiscal Management
  • Accountability

46
Modifying the State Self Study to Develop the
Program Effectiveness Review
  • Customized for LEA
  • Used Indicators only (no evidence)
  • Added Civil Rights indicators
  • Added yes/no for each indicator
  • Column for identifying Strengths
  • Column for Areas of Improvement

47
CTE Web Resources
Performance-Based Monitoring: www.tea.state.tx.us/pbm
Program Monitoring and Intervention: www.tea.state.tx.us/pmi
48
Tennessee Self Assessment Process
  • Dan Covington
  • Director, Fiscal and Information Management
  • Tennessee Department of Education
  • dan.covington@state.tn.us

49
OVAE Targeted Monitoring
  • September Notification of Targeted Monitoring
    Visit
  • Tennessee Targeted Monitoring Visit - December
    1-2, 2005
  • Preparing for the On-site Review
  • OVAE recommended that we
  • Collect Documentation for the Six Topical Areas
  • Complete the Perkins Self-Assessment Tool

50
Perkins Self Assessment Tool
  • Why we did it
  • Tennessee had experienced staff changes within
    the Division
  • The current staff members were not at the SDE
    when Tennessee was monitored in 2002
  • The tool presented a unique data collection
    process to ascertain depth and quality of programs

51
Preparing for the Perkins Self Assessment
  • 16 Stakeholders were Identified
  • Assistant Commissioner (1)
  • Department Directors (3)
  • Program Area Consultants (2)
  • CTSO Consultants (1)
  • Field Service Consultants (2)
  • Vocational Directors (LEA) (3)
  • TCOVE Executive Director (1)
  • Postsecondary, TTC/CC (3)

52
Focusing on Continuous Program Improvement
  • Survey Methodology
  • Representative team of stakeholders including
    secondary and postsecondary identified
  • Assessment was completed by each individual
    stakeholder
  • Results were compiled, and percentages and
    comments were tabulated
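The compilation step above can be sketched as a roll-up of each stakeholder's per-indicator ratings into percentage distributions. The rating labels follow the survey categories named later in this deck; the function name, data shape, and example data are illustrative assumptions.

```python
# Minimal sketch of compiling individual stakeholder ratings into
# percentage distributions per indicator. Labels follow the survey
# categories cited in this deck; everything else is illustrative.
from collections import Counter

LEVELS = ["No", "Minimal", "Moderate", "Complete", "Exemplary"]

def compile_results(responses):
    """responses: {indicator: [rating, ...]} -> {indicator: {label: pct}}"""
    compiled = {}
    for indicator, ratings in responses.items():
        counts = Counter(ratings)          # missing labels count as zero
        n = len(ratings)
        compiled[indicator] = {lvl: round(100 * counts[lvl] / n)
                               for lvl in LEVELS}
    return compiled
```

With 16 stakeholders, each indicator's row would show what share of the team rated it at each level, which is enough to produce the highest/lowest rating orders reported on the next slides.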

53
Why the Self-Assessment Survey?
  • Assess the level of compliance with Perkins III
    legislation for the Quality Indicators
  • Assess the depth and quality of our career and
    technical programs as we continue our 20/20
    visioning process for program improvement

54
Why the Self-Assessment Survey?
  • Assist with the visioning process
  • Determine where we were on each Quality Indicator
  • Begin a validation process for OVAE Monitors
  • Assess where we needed to be

55
Survey Results Analysis
  • Highest rating order: average of responses for
    Complete or Exemplary
  • Local Application
  • Fiscal Responsibility
  • State Administration

56
Survey Results Analysis
  • Lowest rating order: average of responses for
    No, Minimal, or Moderate Implementation
  • Tech Prep
  • Special Populations
  • Accountability

57
High Ratings Analysis
  • Division Initiatives
  • Submission of automated local applications
  • Submission of on-line accountability data
  • Initiated risk-based monitoring processes
  • Building staff capacity
  • Strengthening fiscal management processes
    (FACTS, CATS, staff reassignment)

58
Lower Ratings Analysis
  • Areas-of-Need Focus
  • Secondary and Postsecondary Connections
  • Ensuring Best Results for Special Populations
  • Newness of the Automated Accountability Systems

59
Areas Targeted as Improvement Needs
  • Accountability data from Tech Prep
  • Student Follow-up data reliability
  • Clarify mission
  • Improve collaboration with agencies
  • Improve services and outcomes for Special
    Populations

60
Significant Concerns and Targets
  • Systematic collaboration with Tech Prep and for
    equal access for special populations
  • Secondary/postsecondary collaboration and
    statewide articulation agreements
  • Automated plan applications
  • Preparing special populations for further
    learning and high-skill, high-wage occupations

61
Significant Concerns and Targets
  • Assess academic attainment in the accountability
    system
  • Use accountability data to shape continuous
    improvement
  • Determine a reliable assessment of technical
    skills in the accountability system

62
Use of Survey Results
  • A structure for on-going program improvement
    planning
  • A move beyond mere compliance
  • A database for self-improvement
  • A baseline database for where we are on program
    improvement
  • A needs assessment document for program emphasis

63
Tennessee Action Plan
  • 20/20 Task Force
  • Visioning based on four pillars
  • Academic Vision
  • Articulation-transitions
  • Communication
  • Professional Development

64
Tennessee's Action Plan
  • Division's ongoing Action Plan
  • PMOCs (Project Management Oversight Committee)
  • eTIGER data reporting
  • CATI academic integration
  • Web Design restructure
  • Serving Special Populations
  • Curriculum alignment with Post-Secondary

65
Tennessee Action Plan
  • Division's ongoing Action Plan
  • Perkins online Report Card
  • 20/20 Vision Task Force
  • Name change legislation
  • Transitions from high schools to colleges and
    careers (SREB)
  • Postsecondary Challenge Grants for community
    colleges
  • Statewide articulation agreements

66
What we have learned from the Self Assessment
  • An excellent Planning Tool
  • Division 2005 Retreat will focus on the
    assessment results
  • We have archived our files to document where we
    are and will use them to continue to document our
    strengths and weaknesses
  • Future Monitoring Visit Format

67
Tennessee Secondary Program Data 2004-05
  • Total Course Enrollment: 296,224
  • Agricultural Education: 30,610
  • Business Technology: 81,819
  • Contextual Academics: 8,437
  • Family and Consumer Science: 51,896
  • Health Science Education: 18,378
  • Marketing Education: 15,007
  • Technology Engineering: 8,780
  • Trade and Industrial: 80,576

68
Individual Student Demographic Data
  • Total 9-12 CTE students: 170,134
  • Total 9-12 high school students: 284,615
  • CTE percentage of state total: 59.78%
  • Students with disabilities: 28,135 (16%)
  • Economically disadvantaged: 90,318 (53%)
  • Limited English proficiency: 2,520 (1%)

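The percentages on the demographic slide follow directly from the counts. A quick arithmetic check (figures as reported; the subgroup shares appear to be truncated percentages of the 9-12 CTE total):

```python
# Quick check of the demographic slide's percentages.
# Subgroup shares are taken against the 9-12 CTE total and appear
# to be truncated, not rounded, on the original slide.
cte_total = 170_134
hs_total = 284_615

print(round(100 * cte_total / hs_total, 2))      # 59.78 (% of state total)

for label, n in [("Students with disabilities", 28_135),
                 ("Economically disadvantaged", 90_318),
                 ("Limited English proficiency", 2_520)]:
    print(label, int(100 * n / cte_total))       # 16, 53, 1
```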
69
Our Mission
  • Tennessee
  • Department of Education
  • Helping Teachers Teach and
  • Students Learn