1 ELCC Program Reviewer Training, Part 2
- Welcome!
- Presenters: Honor Fede, ELCC
- Margie Crutchfield, NCATE
- Please remember to set up your audio
- Go to Tools, then Audio, then Audio Setup Wizard
- For those of you who have a microphone, we will do a microphone sound check one by one
- 1:30 pm - First audio sound check
- 1:45 pm - Final audio sound check
- 2:00 pm - Session will begin
2 Session Agenda
- The Program Review Process (Margie)
- General debrief (Margie and Honor)
- Discussion of reviews (Honor)
- Preparing a recognition report (Margie)
- Resources
3 The Program Review Process
4 Timeline: 3 Cycles
- In Fall, you will be asked to complete your reviews between October 15 and November 15
- In Spring I, you will be asked to complete your reviews between March 1 and April 1
- Spring II: Revised reports, May 1 to June 15
5 Review Process
- Institutions submit reports electronically 12 months prior to the site visit
- You will receive an Availability and Conflict of Interest Form
- You will receive notification that you have or have not been assigned to a team of three reviewers
- If assigned, you will be sent a login ID and password
- One reviewer is designated as Lead
6 - The Lead reviewer should contact each member of the team and establish a work timeline and process.
- Each team member should complete a report.
- Team members read each other's reports and discuss differences.
- NCATE will pay for conference calls.
- The Lead reviewer then compiles a final report, which is available to the entire team. Each member of the team is responsible for the final report.
7 - Lead reviewer posts the final report to the NCATE website
- Final reviewer report, in most cases, is reviewed by the SPA Audit Team
- Audit Team posts the final audited report
- NCATE posts report to institution
- BOE team accesses report for on-site visit
8 ELCC Audit Team
- Appointed by the National Policy Board for Educational Administration (NPBEA); five members
- Reviews every Program Recognition Report to ensure fair and unbiased team reports
- May overturn a team recommendation if inconsistencies and errors in reporting are found
- Will also review reports that have been flagged by NCATE staff
9 Teamwork
- Review teams should use whatever approach works best: reviewers should contact each other via email or phone to discuss reviewer report deadlines and set up team meetings immediately after assignments are made. (Note: NCATE will pay for conference calls.)
- Reviewers work independently and submit individual reviewer reports on the AIMS system by the deadline determined by the team.
- Teams meet electronically to discuss reviewer findings from these reports, and the lead reviewer compiles the final team report based on the composite work found in the individual reviewer reports and discussions within the team.
10 Ethics and Confidentiality
11 NCATE Code of Conduct
http://www.ncate.org/programreview/codeOfConduct.asp?ch=37
12 - NCATE board members, program reviewers, and staff shall conduct themselves at all times while representing NCATE as thoughtful, competent, well-prepared, and impartial professionals. To assure institutions and the public that NCATE reviews are impartial and objective, to avoid conflicts of interest, and to promote equity and high ethical standards in the accreditation system, board members, program reviewers, and staff shall follow the Code of Conduct. They should exclude themselves from NCATE activities for any other reasons not listed in the Code which may represent an actual or perceived conflict of interest. Violation of any part of the Code will result in the board member's removal from the board. Program reviewers and staff members will also be subject to disciplinary action, including dismissal.
13 Roles of Reviewers
- Judge alignment of assessment and candidate data with ELCC standard elements (e.g., 2.1, 3.2, etc.)
- Clearly communicate strengths and weaknesses in relation to the standard elements (e.g., 2.1, 3.2, etc.)
- Make a judgment with a clear and open mind
- Make a judgment based on accepted criteria rather than personal bias
14 - The job of the reviewer is not to pass or fail programs, but to make as objective an assessment as possible about the degree to which a given program meets the SPA standards.
15 Reviewers have ethical obligations to be:
- Objective
- Reflective
- Conscientious
- Discreet
16 Reviewers should avoid:
- Discussing program review results with those outside of the review system.
- Revealing deliberations or personal doubts about review results.
- Suggesting specific changes to institutional programs.
- Using subjective or opinionated language.
- Writing comments that are jocular, humorous, flippant, comparative, or otherwise abusive.
17 Decision-Making
18 Decisions on Standards
- A standard element (e.g., 1.3 or 2.4) is met when there is sufficient evidence from one or more assessment sources that the content of the standard is covered by assessments and candidates perform at a minimally acceptable level.
- A standard element (e.g., 1.2 or 4.3) is met with conditions if the assessments are viable but there are no data, OR if the assessments are on the right track and the data demonstrate candidate adequacy.
19 Making the Final Decision
20 Final Decisions
- The program is nationally recognized.
- The program is nationally recognized with conditions:
- Insufficient data
- Insufficient alignment
- Poor assessments, scoring guides, etc.
- 80% rule
- Further development required (if this is the first time the program has ever been submitted) / national recognition with probation (if the program was recognized by the SPA during the last accreditation cycle).
- NCATE staff will determine which of the above applies.
21 Making reviewer decisions on assessment quality
- Does the program have in place a series of 7 to 8 key assessments that, taken as a whole, demonstrate that candidates know and are able to do the concepts found within the ELCC standard elements (e.g., 1.5 versus 2.2)?
- Do candidates perform appropriately on at least one content assessment (1, 2, or 6) and one skill assessment (3, 4, or 7)? (See the sketch below.)
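Purely as an illustration of the content/skill check above (not an NCATE tool or procedure), the rule can be read as a simple boolean test. The function name and inputs here are hypothetical.

# Hypothetical sketch: a program needs at least one adequate content
# assessment (1, 2, or 6) AND one adequate skill assessment (3, 4, or 7).
CONTENT_ASSESSMENTS = {1, 2, 6}
SKILL_ASSESSMENTS = {3, 4, 7}

def meets_assessment_mix(adequate):
    """adequate: set of assessment numbers on which candidates
    performed appropriately."""
    has_content = bool(adequate & CONTENT_ASSESSMENTS)
    has_skill = bool(adequate & SKILL_ASSESSMENTS)
    return has_content and has_skill

print(meets_assessment_mix({2, 4}))  # True: content (2) plus skill (4)
print(meets_assessment_mix({1, 2}))  # False: no skill assessment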
22 In general...
- Recognized
- The program isn't perfect, but it is well on its way; it understands performance assessments, alignment of standards and assessments, etc.
- The majority of standards are aligned, at least one content assessment and one skill assessment are aligned to the standards, and data are given on at least one assessment.
- Recognized with conditions
- The program understands performance assessment and alignment, but may have some serious deficiencies in some scoring guides and/or some assessments
- May not have sufficient data
23 - Further development required / recognized with probation / not recognized
- The program really misses the mark: little or no alignment with the standard elements within the assessment descriptions or scoring guides.
- No data are given on any assessment.
- NCATE staff will determine which of the above decisions should be given.
24 What next?
- If recognized with conditions, the program submits a Response to Conditions report within 18 months
- If further development required or national recognition with probation (FDD/NRP), the program can submit up to two revised reports over a 12-14 month time frame
- The original team also receives these second reports, if possible
25 Overall Questions about the Process
- Initial questions about the process?
- Questions about evaluating assessments?
- Questions about making decisions?
- Were there particular standards that were challenging to evaluate?
26 Writing the Recognition Report
27 Part A. Recognition Report
- A.1. SPA Decision
- Include any conditions, if applicable.
- A.2. Test Results
- Take this information from the Cover Sheet of the Program Report
- 80% of completers in at least the previous year must have passed the state test (see the sketch below)
- This rule is waived if there were fewer than 10 completers in the last 3 years
- A.3. Strengths
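Purely as an illustrative sketch of the pass-rate rule above (not an NCATE tool), the 80% rule and its small-cohort waiver can be expressed as follows; the function name and inputs are hypothetical.

def passes_80_percent_rule(passed_last_year, completers_last_year,
                           completers_last_3_years):
    # The rule is waived when fewer than 10 candidates completed
    # the program over the last 3 years.
    if completers_last_3_years < 10:
        return True
    # Otherwise at least 80% of the previous year's completers
    # must have passed the state test.
    return passed_last_year >= 0.8 * completers_last_year

# Example: 17 of 20 completers passed (85%), 45 completers over 3 years
print(passes_80_percent_rule(17, 20, 45))  # True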
28 Feedback on your reviews: Part A
- The following is a good example of a reviewer who has included a summary of strengths as it relates to specific assessments, but it could also address the overall quality of the report, or the activities, or the faculty, etc. Strengths should be noted. The ELCC wants a well-rounded review.
- Also, note how the reviewer has commented on findings from a previous team report review. They've looked at the previous report and carried over those comments.
29 (No Transcript)
30 Examples of well-written Summary of Strengths statements
- "emphasis on reflective practitioner throughout education coursework"
- "use of an action research project that focuses candidates on their effect on student learning"
- "beginnings of a comprehensive program assessment system that, when refined, should provide useful, current information on candidate success for improving the program and tracking candidate progress"
31 Part B. Status of Meeting SPA Standards
- Designate each standard as Met (M) or Not Met (NM) for the program being reviewed
- For every element that is NM, include an explanatory comment
- The comment should provide enough information for the program's faculty to understand and remedy the issue
32 Feedback on your reviews: Part B
- Be sure to complete each section of the National Recognition report
- For a Response to Conditions or a Revised Report, review only those standards and assessments that were Not Met or Met with Conditions in the last team report. Copy the standards marked Met from the previous team report into your recognition report.
- If you mark anything as Met with Conditions or Not Met, you must provide a comment explaining why.
33 Things to Do
- In general:
- Be sure to complete each section of the report.
34 Feedback on your reviews: Part B
- Be sure to be consistent
- If you mark a standard element as Met, you don't need a comment. (Note the first example.)
- However, the next example marks an element as Met but then lists a comment noting an area of non-compliance. This should be changed to Met with Conditions.
35 (No Transcript)
36 (No Transcript)
37 Feedback on your reviews: Part B
- Let's talk about Standard 7.0, Internship.
- Standard elements 7.1, 7.2, 7.4, 7.5, and 7.6 are about the design of the internship, not candidate performance.
- Standard element 7.3 addresses Assessment 4, Internship.
- Information may be found in the overview section of the institutional report under practicum, or within the overview narrative for Assessment 4, Internship.
38 (No Transcript)
39 Part C. Evaluation of Program Evidence
- C.1 Candidates' knowledge of the content of the SPA standards
- C.2 Candidates' ability to understand and apply pedagogical and professional content knowledge, skills, and dispositions
- C.3 Candidate effects on student learning and creation of environments that promote student learning
40 Feedback on your reviews: Part C
- C.1 Content: This reviewer did a good job of providing specific feedback on the alignment of the assessment description, scoring guide, and data to specific ELCC standard elements for each of the content assessments (Assessments 1, 2, and 6).
- The reviewer outlined which standards were covered by each assessment, which is a plus, and identified the assessment activity.
- Any problems were clearly noted for each assessment.
41 Be careful to be consistent in your comments so as not to confuse the Audit Committee.
- For instance, in one report the team gave National Recognition status to a revised report.
- PART B - STATUS OF MEETING SPA STANDARDS lists all standard elements as met.
- Then, PART C - EVALUATION OF PROGRAM REPORT EVIDENCE, C.2, contains the comment: "The descriptions for professional skill Assessments 3, 4, and/or 7 refer to ELCC Standard Elements 1.1, 1.2, 2.2, 2.3, 2.4, 3.2, 3.3, 4.1, 4.2, 5.1-5.3, and 6.1-6.3. ELCC Standard Elements 1.3-1.5, 2.1, 3.1, and 4.3 are not referenced in these assessments. Assessment descriptions align with specific ELCC Standard Elements, scoring guides align with assessment descriptions and measure progress on specific ELCC Standard Elements, and scoring guides are evaluation tools for measuring progress on assessments. These assessments were revised based on July 2008 Recognition Report results and implemented fall 2008. The first data collection is scheduled for the end of the fall 2008 semester."
42 Feedback on your reviews: Overall
- Your main job as a reviewer is to evaluate the quality of the assessments as they relate to the ELCC standard elements.
- Don't get hung up on organizational details, like whether Section III aligns to Section IV, or wrong numbering of the assessments (e.g., Assessment 5 is marked as the internship assessment when it should be Assessment 4). Look at the overall picture: do they have three content assessments and three skill assessments? Maybe they are numbered wrong; that's confusing, but OK.
- The same goes for editing the report for typos, spelling, and writing quality; that is not the purpose of your review.
- Some organizational detail is important if you are missing information that makes it impossible to review the assessments.
43 Feedback on your reviews: Part C
- C.1 Content: Here is another good example of detailed reporting on the quality of the assessment description, scoring guide, and data. The reviewer has included more information on the quality of the activity.
44 (No Transcript)
45 Bad example of one-line comments
46 Feedback on your reviews: Part C
- C.2 Professional Skills: The following reviewer has not provided any comments about the quality of the assessments. Yet, in another section of the report, a weakness was stated: "The scoring guide needs to be slightly revised to go along with Assessment 5." More explanation is needed to understand why this statement was made.
- On areas of non-compliance, be careful about making one-line comments without any explanation. Auditors and institutional faculty will want to know what was wrong with the scoring guide that led the reviewer to make this comment.
- Be sure to fill out all sections of the report; don't leave blanks.
47 (No Transcript)
48 Example two: a one-liner statement
49 - If you state that an assessment or a standard is not clear or not aligned, be sure to state why. Give the reader enough information to understand the problem.
- "The program report provides limited information about the assessments. Greater information and clarification is needed." What does this mean? The reviewer has not given enough specific information in C.1, C.2, or C.3 to understand this comment.
50 Beware of making personal judgments that can't be substantiated by concepts found in the ELCC standard elements.
- Such as: "Assessing candidates through a Vision Paper administered after 12 hours in the program is too late to determine their match with the program mission and vision."
- Stay away from personal biases about how the program or assessments should be designed or offered; we don't have a standard on this.
- Part E, Areas for Consideration, may be used for personal suggestions outside of the standards, as long as they are used wisely, such as comments on faculty qualifications or activity suggestions (things that relate to the larger NCATE context or are helpful to the institution).
51 Feedback on your reviews: Part C
- C.2 Professional Skills: The following example does not give the reader enough important information about the quality of each of the assessments. The reviewer has listed the activity, but we don't know anything else.
52 (No Transcript)
53 When Making the Final Report Decision Recommendation
- Look holistically across all the summative statements for each assessment within Sections C.1, C.2, and C.3. Are the ELCC standard elements met in at least one of the assessments in C.1 Content, C.2 Professional Skills, and C.3 Effects on Student Learning? If so, the program should be given either National Recognition or Recognition with Conditions. There may be problems with one or more assessments or standards, but overall, do they align with the majority of elements within each standard (e.g., 1.0, 2.0)?
- We want to know whether each of the assessment descriptions, scoring guides, and data tables is aligned to our standard elements. Then look holistically for your final team decision.
54 Feedback on your reviews: Part C
- C.3 Effects on Student Learning: Remember to look for either a survey or an assessment that demonstrates how candidates are evaluated on their ability, as leaders, to support student learning (achievement) related to our standards.
55 (No Transcript)
56 What do we mean by Alignment?
- Do the assessment activity and/or scoring guide align to the majority of concepts found in the standard element (e.g., 2.3)? This does not mean that every single sub-element (e.g., 2.3a versus 2.3b) is mentioned word-for-word in the assessment description and/or scoring rubric, but that the overall concepts, defined holistically by the majority of sub-elements, can be seen in the assessment description/scoring guide.
- Common problem: assessments are aligned to the standard as a whole (e.g., ELCC 1.0) and not broken out by standard element, or vice versa.
57 Other Common Problems Found by Reviewers of Assessments
- Data are not aggregated by the categories and standards outlined in the assessment scoring guide.
- Assessment scoring rubrics are aligned to the ELCC standards but show no relationship to the assessment activity description.
- ELCC standards are lumped together into one criterion measure rather than evaluated separately.
58 Part D. Evaluation of the Use of Assessment Results
- Evaluation of Section V of the Program Report
- Is it clear that assessment evidence is used by the institution in evaluating the program, counseling candidates, and revising courses or other elements of the program?
- Has the institution made program changes based on assessment evidence?
- Do you find the faculty interpretations consistent with the evidence provided in the program report?
- Are the implications for programs that appear in this section of the program report derived from the interpretations?
59 Nice Complete Statement
60 Part E. Areas for Consideration
- Not standards-based
- These should be more global concerns or issues
- Broad, programmatic issues
- Will NOT be addressed in a Revised Report or a Response to Conditions Report
61 Part F. Additional Comments
- F.1 Comments on Context or other issues
- F.2 Instructions for the Board of Examiners
- Could be issues not related to the SPA standards, but related to the NCATE Unit Standards
62 If the final decision is Conditions
- Part G MUST be filled out
- It acts as a contract between the program and NCATE
63 Response to Conditions Statements
- Characteristics of a well-written Conditions statement:
- CLEAR
- PRECISE
- OBJECTIVE
- STANDARDS-BASED
- EVIDENCE-BASED
64 An example of a well-written CONDITION
- Assessments 1 and 2: The program needs to address candidate knowledge of content by standard (academic discipline) and by category (sub-scores) on the Praxis II exam. It must also present aggregated candidate data (grades) in a table or chart of candidate scores from high to low for each standard.
- Assessment 5: Candidate impact on student learning needs to be addressed by the degree to which secondary students learned from candidate instruction. The unit plan assignment could well yield such data if the candidate taught it in a secondary classroom and conducted something as simple as a pre- and post-test on the unit.
- Standard 3.2: The program must identify the instructor(s) of the Social Studies Methods course(s) and indicate their backgrounds in social studies education or in one of the disciplines.
- Data from all assessments for the next year must be collected and analyzed.
- Additional procedures for evaluating post-baccalaureate candidates need to be implemented that ensure standards and indicators are met.
- Concerns cited under Standard 1 must be addressed.
65 Reviewing Revised or Response to Conditions Reports
- If possible, the report will be assigned to at least one reviewer from the original review.
- If Revised, reviewers will only evaluate standards that were previously Not Met or Met with Conditions.
- If Response to Conditions, reviewers only address issues listed in Part G.
66 Revised or Response to Conditions Reports
- The review team that looks at this type of report when it is re-submitted will zero in on the conditions section to determine whether the conditions were met.
- Reviewers can't reverse previous decisions on met standards or assessments. They also can't add new concerns or conditions unrelated to the concerns/standards addressed in the last report.
- High stakes!
67 Data Rule
- Data rule for Spring '09:
- Two years of data for all assessments (with an understanding of the program's context)
- If a key assessment is in a course that is taught once a year, then one year of data = one semester of data
- If the program is brand new or going through program review for the very first time, one year of data is required
- Starting in Fall '09, three years of data. But stay tuned!!
68 Resources for Reviewers
- NCATE website: http://www.ncate.org/programreview/resources.asp
- Archived web seminars (held every semester)
- Mini videos
- Your review partners, the ELCC Coordinator, Honor (fedeh@principals.org), or NCATE staff
- Everyone says the first review can be tough. Don't be reluctant to consult with your fellow reviewers and with Honor.
69 (No Transcript)
70 (No Transcript)
71 NCATE Staff
- Margie Crutchfield, margie@ncate.org
- Robin Marion, robin@ncate.org
- Sabata Morris, sabata@ncate.org
- Monique Thomason, monique@ncate.org
- Tech Support: Cora Mak, cora@ncate.org