Title: Learning from quality evaluation
1 Learning from quality evaluation
- creating conditions to support enhancement
2 Overview
- Academics' responses to quality evaluation
- Some implications for policy makers and quality services
- Factors influencing responses to audit and evaluation
- A cautionary note: quality enhancement is a messy business
- Improving outcomes to quality evaluation
- Creating conditions to support enhancement
3 Academics' responses to quality evaluation
- quality as ritualism and tokenism
  - procedures used to satisfy external accountability
  - quality enhancement becomes residual and marginalised
- quality as impression management
  - preparations for evaluation are stage-managed
  - behaviour is scripted
- quality as burden
  - evaluation viewed as extra work
  - quality associated with a compliance culture
4 Academics' responses to quality evaluation (contd)
- quality as suspicion of management motives
  - quality evaluation seen as a management tool
  - threat to academic autonomy
- quality as discipline and technology
  - better systems, better bureaucracy
  - but not quality improvement
  - perception that quality can improve without evaluation
- quality as a culture of getting by
  - evaluation requires ownership, but staff enthusiasm varies
  - situational constraints: confusing demands
5 Some implications for policy makers and quality services
- no blueprint for quality evaluation systems
- impact of context and circumstance: no blank sheet
- system users shape, adapt, subvert, or change quality policy
- quality is essentially contested: managerialism versus innovation?
- implementation gap: are accountability and improvement reconcilable?
6 Factors influencing responses to audit and evaluation
- hypothesis
  - If certain conditions and requirements are not met, a university's quality system may fall into disrepute and effective follow-up to evaluation will be undermined.
- some key factors
  - reciprocal accountability and the psychological contract
  - total quality
  - closing the quality loop
  - policy and strategy overload
7 Factors influencing responses: reciprocal accountability and the psychological contract
- "Just as in most work situations there is a legal contract between the organisation and the individual...so there is an implied, usually unstated, psychological contract between the individual and the organisation, results that will satisfy certain of our needs and in return for which we will expend some of our energies and talents" (Handy, 1993).
- quality evaluation should take account of staff values and expectations
- importance of reciprocal accountability and mutual responsibility
8 Factors influencing responses: total quality
- "What we have, at best, is partial quality management; all it covers is course management. It does not monitor many service aspects" (Respondent 34, Central Quality Unit).
- most forms of quality evaluation do not penetrate total quality issues
- lack of transparency influences perceptions of evaluation outcomes
- ...perception that accountability is not evenly spread
9 Factors influencing responses: closing the quality loop
- "Problems local to the department such as the method for handing in assignments have been addressed, whereas those requiring action elsewhere in the institution (such as library stock) are still outstanding." (External Quality Evaluation report, Humanities)
- a common factor which frequently brings quality evaluation into disrepute is a perceived lack of effectiveness in closing the loop
10 Factors influencing responses: policy and strategy overload
- Respondent 24 (Senior lecturer, Computing): "And there are conflicting messages. I mean, it's what's flavour of the month."
- Respondent 25 (Senior lecturer, Business Studies): "Research. No research. Lots of classes. Not so many classes. Income generation. Oh, forget that... Get in the class and teach."
- growing volume of policies and strategies
- staff report confusion and resignation
11 A cautionary note: quality enhancement is a messy business
- gap between policy intentions and realisable outcomes
- we work at the edge of chaos: things don't work as we intended (Tosey, 2002)
- institutional factors may impede quality teaching and learning (Biggs, 2002)
- some common QA procedures have the opposite effect to that intended
  - Biggs cites student feedback, distinguished teacher awards, external assessment
- much quality evaluation activity makes no direct contribution to enhancing teaching and learning
- need greater honesty and realism in quality-related thinking and practice
12 Improving outcomes to quality evaluation
- performance and integrity of quality systems
- more enhancement, a little less regulation (Brown, 2002)
- evidence-based approach to quality-related policy and practice
- creating conditions to support enhancement
13 Improving outcomes: performance and integrity of quality systems
- Ideal requirements...
  - quality evaluation is perceived as making a difference
  - quality evaluation contributes to organisational learning
  - feedback and communication are effective
  - follow-up not constrained by volume of policies, targets, and action plans
- ...improving quality technology is necessary but not sufficient for better evaluation outcomes
14 Improving outcomes: more enhancement, a little less regulation
- "While the forces of accountability are strong, those devoted to improvement, including the promotion of innovation, are fragmented" (Brown, 2002).
- much quality assurance and quality evaluation has been conservative and inhibiting
- quality evaluation tends to be accountability-led, not enhancement-led
- imbalance between regulation and development creates obstacles to improvement
- apply a strict test: does quality evaluation lead to quality enhancement and improvement...and is there evidence to illustrate this?
15 Improving outcomes: evidence-based quality-related policy and practice
- thought for today
  - most quality evaluation systems do not generate a robust evidence base to illustrate what works in practice for quality enhancement, and why it works
- features of research-informed policy making
  - use the best evidence available from various sources
  - take a long-term view of the likely effect and impact of policy
  - constantly review policy to ensure it really deals with the problems it is designed to solve
  - learn from experience of what works and what doesn't through systematic evaluation
16 Creating conditions to support enhancement: findings of an illustrative case study
- Institutional self-study (commissioned by the UK Learning and Teaching Support Network)
- need to track, audit, and disseminate innovation and good practice
- importance of local/departmental cultures and communities
- mechanisms for addressing academics' varied responses to enhancement initiatives
- importance for academics of genuine enhancement, not over-formalised systems
- the less evaluation is associated with accountability...the more positive the involvement of academics
- strong interest in good practice links with other HEIs: what works elsewhere!
17 Creating conditions to support enhancement: case study (contd)
- Adjustments made to internal quality evaluation
  - annual monitoring reports include an evidence-based approach to quality enhancement
  - internal audit of courses to provide information on enhancement and dissemination
  - improved links and communication between centre and local level
- developing capacity to research and evaluate
  - knowledge and evidence base used to enhance quality
  - evaluate the evaluators!