Title: Building Evaluative Capability in Schooling Improvement
1. Building Evaluative Capability in Schooling Improvement
- Helen Timperley, Judy Parr, Rachel Dingle, Margie Hohepa, Mei Lai, Deidre Le Fevre
2. NZ Policy Context
- Policy goal: that all students have the knowledge and skills to be successful citizens
- Current reality: a sizeable number of students are in the achievement tail (e.g. PIRLS, 2006)
- NZ has essentially self-managing schools with a relatively low-accountability system (no mandated testing)
- Tension: how to achieve the policy goal when schools are underperforming
3. Policy Responses
- There are a number of voluntary, Ministry of Education-sponsored initiatives aimed at raising student achievement, e.g.:
  - Schooling Improvement
  - Extending High Standards Across Schools
  - The Literacy Professional Development Project
  - Numeracy Development Project
  - Assessment to Learn
4. Schooling Improvement
- The NZ Ministry has premised its Schooling Improvement (SI) efforts on networks/clusters of schools working with Ministry officials and external providers to raise student achievement
- SI is a school-owned change model: it begins with analysis of student achievement, then cluster leaders (with an external expert) develop a cluster-based theory for improvement as the basis for change
5. Schooling Improvement (contd)
- There are up to 20 clusters (ranging from just starting to 8 years in existence) of varying size (5-30 schools)
- Each cluster in SI reports regularly to the Ministry; reporting includes student achievement, but:
  - clusters report in varying ways
  - clusters use different tools to measure the same construct (problematic in terms of comparing achievement data)
  - the quality of the data is variable
6. The Project
- As a result, little is known at the system level about overall effectiveness
- The Ministry commissioned an evaluation of SI as a policy intervention, involving:
  - identification of practices associated with particular profiles of achievement
  - a brief to build evaluative capability within SI
7. Our Challenge
- With no systematic student achievement data (let alone other data) to draw on, we had to devise a methodology to address this brief
- We began by:
  - developing a notion of evaluative capability
  - defining criteria for improvement in student outcomes and assisting schools to understand measurement issues
  - identifying practices likely to impact on these outcomes
8. What is Evaluative Capability? (Our Starting Point)
- Central to evaluative capability are systems (structures, processes, knowledge, resources, tools) for:
  - Identifying evaluation questions and information needs
    - who needs to know what, when and why
  - Establishing valued outcomes
    - identifying, and making salient to members, the particular outcomes that are valued by participants
  - Developing a shared understanding of the underlying causes of identified problems over which those involved have leverage
9. What is Evaluative Capability? (contd)
- Selecting and taking courses of action that address the identified causes
  - courses of action understood in terms of the available knowledge: the underpinning theories about what is effective, and the empirical evidence associated with their evaluation
- Ensuring overall coherence among activities designed to achieve the valued outcomes
- Checking progress
  - providing relevant and accurate information about progress towards the valued outcomes
  - identifying any anomalous or unintended consequences
10. What is Evaluative Capability? (contd)
- Ensuring that all those who need to know can answer their questions with timely and accessible information
  - making relevant information and knowledge accessible to those who have an interest
  - critiquing and negotiating meaning with these groups
  - interpreting the information in light of current understandings, in relation to the decisions to be made
- Making adjustments / changing courses of action
  - engaging with multiple sources of knowledge to develop and take informed action in response to the information generated
  - taking informed action in light of the information gathered
- Embedding the demand to know as a socialized practice
11. What We Would Like You to Do
- Critique our definition of evaluative capability
in terms of its theoretical framing and practical
significance
12. Student Outcomes
- Student outcomes:
  - student achievement (largely literacy focused)
  - valued outcomes for Māori (cultural and language regeneration, educational achievement)
- Identify what constitutes desirable progress (issues of different instruments, time spans and ways of measuring progress)
13. Practices Impacting Outcomes
- Likely practices encompassed within 7 themes:
  - Cluster structures and processes
  - Evaluative capability
  - Theories for improvement and sustainability
  - Teacher and leader professional development
  - Classroom practice
  - Student achievement data analysis
  - Māori-medium education
14. How to Address the Brief
- Given the lack of knowledge about what is happening, and how effective it is, within and across SI clusters, our challenge was how to find this out
15. Our Approach
- We decided on a two-phase approach
- We began with an overview, an inventory of cluster activities, that aimed to:
  - map the activities of the clusters
  - provide formative information to help the Ministry and clusters move forward, by means of critique of current practice
- We then continued with in-depth work with successive groups of clusters
16. The Issue: Finding Out What is Really Happening
- Our aim is to discuss the methodological difficulties in obtaining an accurate picture of network activities and effectiveness, so as to identify issues relevant to those involved in SI (in terms of improving both the quality of activities and student success)
17. Inventory Method
- 15 clusters of schools voluntarily participated
- Interview survey with 112 school leaders and 22 cluster leaders
- Each interview began by completing a diagram of structures and functions, and nominating leaders' roles
- Interviewees were asked about specific practices to uncover clusters' theories-in-use
18. (No transcript available for this slide)
19. Method (contd)
- Interviewees were asked, in reference to the project they put most energy into in 2007, about, for example:
  - how they judged success
  - how they monitored progress
  - how they obtained information about classroom practice in relation to the project
  - whether relevant student achievement data were shared
  - the purposes for which data were used
20. Method (contd)
- Most interview questions were open-ended and coded using pre-determined categories at the time of interview
- Some questions involved rating and giving reasons for the rating; the latter were coded by researchers
- Documents analysed included the Ministry's operations manual (2008) and cluster-specific documents held by the national office
21. Feedback: Another Data Source
- Cluster-specific reports written around the themes
- Oral feedback given to each cluster
- Reports distributed and clusters responded
- Synthesised analytical report written for the Ministry
- Feedback sessions tailored to specific groups (Ministry policy makers; cluster co-ordinators and professional developers)
22. The Picture
- There was a sense that participants were able to talk the talk
  - e.g. 81% identified the main aim of their project as raising student achievement; 62% claimed they related cluster patterns to achievement; self-rating of evaluative capability was 3.8 (5-point scale)
- Observations (informal, at the start of the in-depth work) suggested the inventory data presented a rosier picture than the reality
- Analysis of interviews and documents yielded issues and discrepancies
23. Issues
- Pace of change
  - limited sense of urgency to develop effective solutions in established clusters (irrespective of achievement shifts or time taken)
- Cluster structures and processes
  - confusion in perceptions of roles, co-ordinating structures and mechanisms, and their purposes
  - unclear accountabilities
24. (No transcript available for this slide)
25. Discrepancies
- Sources of discrepancy:
  - in responses from members of the same cluster (particularly between cluster leaders and school leaders): matters of fact, interpretation and belief
  - between interview responses and documents
  - in responses vis-à-vis alternative theories of effectiveness
26. Discrepancy Within Clusters
- Ministry of Education coordinators identified themselves as cluster leaders; their nomination as such by principals varied across clusters
- Who decided the foci for professional learning?
  - Cluster leaders nominated themselves (70%)
  - School leaders nominated cluster leaders (33%)
  - Cluster leaders nominated individual schools (0%)
  - School leaders nominated individual schools (47%)
27. Discrepancy Between Interview Responses and Documents
- Discrepancy between what was funded as Māori-medium provision and clusters' recognition of its existence
- Cluster documents described cluster processes for sustainability, BUT half the school leaders said nothing was in place for sustainability, and those who believed something was in place did not refer to the documented processes
28. Discrepancy re Theory of Effectiveness
- Omission: in describing the role of the cluster leader and the purposes of structures, accountability was rarely mentioned
- Competing theories of effectiveness: the quality and specificity of clusters' theories for improvement varied; most existed for compliance rather than operational reasons (e.g. the operational theory differed from the official one, or multiple theories were not integrated)
- Theories lacking evidence of effectiveness: most theories included PD to change practice, BUT there was little evidence about the adequacy of initial practice (or the impact of PD on practice)
29. Resolving Discrepancy Through the Feedback Process
- Documents were obtained through the national office (signed off as appropriate by the local MoE coordinator)
- The feedback process highlighted the varied extent to which these documents accurately reflected cluster functioning, and/or the extent to which they were collaboratively developed
30. Towards Explanation: The Feedback Process as Further Cross-Checking
- Interview reports and documents were sometimes at variance, e.g. regarding classroom observation or the theory for improvement
- Documents were obtained through the national office (signed off as appropriate by the local Ministry coordinator)
- The feedback process highlighted the varied extent to which these documents accurately reflected cluster functioning, and/or the extent to which the ideas within them, or the reports, were collaboratively developed
31. Alternative Explanations?
- For these issues and examples of discrepancy,
have we overlooked other potential explanations? - To what extent do you think our methodology
played a part in the issues identified?
32. Discussion
- Interviewing involves participants in meaningful ways and introduces the larger project to the cluster through people, not paper
- The question is the extent to which self-report data:
  - reflect the reality
  - capture both espoused theories and theories-in-use
- This matters if we are to progress a more meaningful understanding of what is happening and provide information for a way forward
33. Conclusion
- The results of our analyses, including those arising from various forms of cross-checking, show considerable potential for raising issues
- The outcome has been widespread discussion within clusters and across several interest groups (Ministry officials, cluster co-ordinators, PD providers)
- However, we consider such survey methods most productive when coupled with observations for discussing the nuances of practice