Title: MEASURING THE IMPACT OF RESEARCH: LESSONS FROM PROGRAMME EVALUATION
1. MEASURING THE IMPACT OF RESEARCH: LESSONS FROM PROGRAMME EVALUATION?
- Johann Mouton
- www.sun.ac.za/crest
- 17 June 2004
2. The storyline
- The programme evaluation (PE) literature has over the past forty years developed a fairly standard classification of evaluation types. There are certain distinctions that are (at least within realist approaches) relatively widely accepted. I begin by discussing some key notions in programme evaluation, including the Logic Model framework, and give an example of how it has been useful in constructing a monitoring framework of research at the systems/institutional level.
- Part Two is devoted to a case study in research impact assessment. In this discussion I reconstruct the processes and events that led to high levels of uptake and impact.
- In the final part of the paper, I make some general comments and observations about the notion of research impact.
3. Part One: Impact assessment within programme evaluation studies
4. The logic of interventions

Context

- INTERVENTION STRUCTURE: programme resources, activities and outputs (e.g. workshops, courses, manuals)
- INTERVENTION MANAGEMENT: human resources, project administration, M&E system
- GOALS: objectives, target group, stakeholders
- MEASURABLE OUTCOMES: more knowledgeable / increased competence / higher productivity
5. Applying the logic model to interventions
The logic model traces a chain from problem to indicators:

Problem → Programme aims and objectives → Resources → Activities → Outputs → Outcomes/effects → Indicators

- Resources: what you need to carry out the activities you have planned (people, money, materials, infrastructure)
- Activities: what you do with the resources you have
- Outputs: what the activities produce, e.g. products, deliverables, goods, services
- Outcomes/effects: what happens to the target group as a result of the delivery of the programme
- Indicators: concrete and measurable signs of occurrence, often quantified and aggregated into composite measures (indices)
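To make the chain concrete, the following minimal sketch (in Python, using an entirely hypothetical training programme) records the elements of a logic model so that each stage can be checked against the next:

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """One logic-model chain for a single intervention."""
    problem: str
    objectives: list[str]
    resources: list[str]   # what you need: people, money, materials
    activities: list[str]  # what you do with the resources
    outputs: list[str]     # what the activities produce
    outcomes: list[str]    # what happens to the target group
    indicators: list[str]  # measurable signs that the outcomes occurred

# Hypothetical example, for illustration only
training = LogicModel(
    problem="Low research productivity among junior staff",
    objectives=["Increase the competence and output of junior researchers"],
    resources=["Facilitators", "Funding", "Course materials"],
    activities=["Workshops", "Mentoring sessions"],
    outputs=["12 workshops delivered", "3 manuals produced"],
    outcomes=["Higher publication productivity in the target group"],
    indicators=["Publications per FTE researcher, before vs. after"],
)
```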
6. Types of programme monitoring

- Process or implementation evaluation
  - Process evaluation verifies what the programme is and whether it is delivered as intended to the targeted recipients
  - It addresses issues about the effectiveness of programme operations and service delivery, and whether the programme is successful in reaching the target group as planned (coverage)
- Routine programme monitoring and management information systems
  - Gathers information on programme outputs, e.g. number of clients served, quality of service provided, number of workshops delivered, etc.
  - Continuous monitoring of indicators of selected aspects of programme processes or activities as a tool for effective management
- Performance measurement and monitoring
  - Accountability demands require that programmes demonstrate that they accomplish something worthwhile, often against set standards or benchmarks
  - Orientated towards the assessment of outcomes, i.e. the results of services
7. Impact assessment in programme evaluation

- The basic aim of impact assessment is to produce an estimate of the net effects of an intervention, i.e. an estimate of the impact of the intervention uncontaminated by the influence of other events or processes that may also affect the behaviour or changes that a programme is directed at achieving (Freeman & Rossi)
- Prerequisites for assessing impact:
  - The programme's objectives must be sufficiently well articulated to make it possible to specify credible measures of the expected outcomes
  - The programme must have been sufficiently well implemented: there must be no question whether its critical elements have been delivered to the appropriate targets
8. Gross versus net outcomes

- Establishing a programme's impact, i.e. its combined and accumulative effects (outcomes), is identical to establishing that the programme is a cause of a specific effect or effects.
- It is important that causality not be confused with lawlikeness. "A is a cause of B" usually means that if we introduce A, B is more likely to result than if we do not introduce A. This statement does not imply that B always results, nor does it mean that B occurs only if A is introduced.
Gross outcome = Effects of intervention (net effect) + Effects of other processes (extraneous factors) + Design effects
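A minimal numerical sketch (Python, hypothetical figures) of how this decomposition is applied: the change observed in a comparison group stands in for the extraneous factors and is subtracted from the gross outcome; design effects are ignored here for simplicity.

```python
# Hypothetical before/after outcome scores (e.g. a competence test).
# The comparison group captures extraneous factors: changes that would
# have happened anyway, without the intervention.
participants = {"before": 52.0, "after": 68.0}
comparison   = {"before": 51.0, "after": 57.0}

gross_outcome = participants["after"] - participants["before"]  # 16.0
extraneous    = comparison["after"] - comparison["before"]      #  6.0

# Net effect = gross outcome minus extraneous effects
# (design effects are omitted from this sketch).
net_effect = gross_outcome - extraneous                         # 10.0
print(f"Gross: {gross_outcome}, extraneous: {extraneous}, net: {net_effect}")
```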
9. An example at the systems/institutional level

- The National Plan on Higher Education (2002) has set the following five systemic goals for Higher Education research:
  - To redress the inequities of the past as far as race and gender are concerned (Equity)
  - To ensure that HE research is in line with national goals (Responsiveness)
  - To increase the volume of research output (Quantity)
  - To improve the quality of research produced in the system (Quality)
  - To improve the efficiency (throughput) of the system (Efficiency)
10. Domains by system goals
11. An example of indicators of research (inputs, outputs, effects) at the systems/institutional level

Goal codes: Equity (EQ), Participation (P), Responsiveness (R), Quality (Q), Efficiency (EF)

Domains:
- RESEARCH: researchers, funding, research process, research outputs
- EFFECTS: extent of research utilisation (R), internal efficiency, effectiveness

Indicators, coded by goal:
- Number of black and female researchers receiving NRF funding (EQ)
- R&D expenditure (P)
- Contract income (R)
- THRIP funding (R)
- Number of rated researchers (Q)
- NRF grants in focus areas (R)
- Number of black, female and young PhDs (EQ)
- Number of research publications (P)
- Number of ISI publications (Q)
- Publication trends in line with national goals (R)
- Research co-authorship (R)
- Publications per FTE researcher (EF)
- Publications per R&D expenditure (EF)
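As an illustration of how the two efficiency (EF) indicators in the list above could be computed, here is a short sketch with entirely hypothetical institutional figures:

```python
# Hypothetical institutional figures, for illustration only.
publications = 340                 # accredited research publications in a year
fte_researchers = 215.5            # full-time-equivalent researchers
rd_expenditure_rand = 48_000_000   # annual R&D expenditure in Rand

# Efficiency (EF) indicators from the table above:
pubs_per_fte = publications / fte_researchers
pubs_per_million_rand = publications / (rd_expenditure_rand / 1_000_000)

print(f"Publications per FTE researcher: {pubs_per_fte:.2f}")
print(f"Publications per R1m R&D expenditure: {pubs_per_million_rand:.2f}")
```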
12. Part Two: A case study in research impact assessment

- CREST's research programme on the demographics of SA scientists
13. The CREST research on the demographics of the SA S&T workforce: the origins

Timeline, 1997-2001: NRTA R&D Survey → survey database → development of SA Knowledgebase → public presentation → DACST not interested in its further development
14. The production of scientific knowledge in SA: race trends
15. The production of scientific knowledge in SA: gender trends
16. The production of scientific knowledge in SA: age trends
17. Initial dissemination of the results (2001-2002)

- Public presentations
  - Science in Africa symposium (October 2001)
  - SARIMA founding meeting (February 2002)
  - NACI Discussion Forum in Pretoria (April 2002)
- Publications
  - (with A. Bawa) Research chapter in Transformation in Higher Education in SA (submitted July 2001; published in 2002)
  - NACI Facts and Figures 2002
  - "South African science in transition", Science, Technology and Society, Vol. 8 (2003)
18. Accelerated uptake and utilisation of the CREST results

- NACI commission on South African Science: Facts and Figures (November 2001; finished April 2002)
- COHORT request (June 2002) for a think piece on "A new generation of scientists" (report submitted in December 2002)
- All three slides on the "frozen demographics" included in the Department of Science and Technology's final version of the new R&D Strategy (August 2002)
- Request by the Academy of Science of South Africa for a document on "Promoting worldwide S&T capabilities for the 21st century" (August 2002; report submitted and symposium held October 2002)
- Reference in numerous press statements by ministers, directors-general and senior officials of the DST, DoE, NRF, SAUVCA and others
- Requests from a number of universities and professional societies (Physics, Marine Science) for the data
19. Mapping the effects of the CREST research programme

- IE1 (better understanding)
  - IE11 (expanded research programme)
    - IE111 (journals)
    - IE112 (language)
- IE2 (dbase development)
  - IE21 (expanded dbase)
  - IE222 (OSTIA)
- UE1 (inform R&D strategy) → leverage funds for R&D (?)
- UE2 (inform strategic planning)
  - UE21 (DST)
  - UE22 (NRF)
- UE3 (inform HR policy)
  - UE31 (UCT)
  - UE32 (US)

Together these constitute the ACCUMULATIVE EFFECTS.
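A minimal sketch (Python) of how such an effects map can be represented as a tree and traversed to enumerate the accumulative effects of one proximate effect. The hierarchy is inferred from the effect numbering above and is illustrative only:

```python
# Each effect maps to the list of follow-on effects it spun off.
effects = {
    "IE1 (better understanding)": ["IE11 (expanded research programme)"],
    "IE11 (expanded research programme)": ["IE111 (journals)", "IE112 (language)"],
    "IE2 (dbase development)": ["IE21 (expanded dbase)", "IE222 (OSTIA)"],
    "UE1 (inform R&D strategy)": ["Leverage funds for R&D"],
    "UE2 (inform strategic planning)": ["UE21 (DST)", "UE22 (NRF)"],
    "UE3 (inform HR policy)": ["UE31 (UCT)", "UE32 (US)"],
}

def all_effects(root: str) -> list[str]:
    """Depth-first walk: the root effect plus everything it spun off."""
    out = [root]
    for child in effects.get(root, []):
        out.extend(all_effects(child))
    return out

# The accumulative effects flowing from one proximate effect:
for e in all_effects("IE1 (better understanding)"):
    print(e)
```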
20. Some observations on the process

- Researcher-directed and -driven dissemination was soon complemented by user/demand-driven dissemination
- Differentiated effects: the research produced a range of very different kinds of effects across a wide range of actors
- The majority of the effects were unexpected and not foreseen
- One effect often spun off multiple subsequent effects
21. Part Three: Concluding observations
22. Where programme evaluation tools are useful

- Outputs and outcomes (or effects) are not to be confused. Research outputs are the immediate (epistemic) results (findings/data) that flow from the research process. Outcomes (effects) imply some change, some recognition of the value of these results and their subsequent utilisation or uptake: other scientists who cite my results, users who apply the newly acquired information in various ways through technology development or improvement, policy formation, improvement of practice, and so on.
- Research outputs can generate multiple effects: one discovery or research finding can produce many and diverse outcomes (some immediate, which we will call proximate effects; others accruing over a longer time frame, which we will call accumulative effects).
- Effects in turn often spin off other effects (the multiplier effect), and new effects (often unintended) emerge over time.
23. Knowledge production, uptake or utilisation and impact

PRODUCTION: modes of research yield research(-based) outputs/results, i.e. new knowledge:
- CODIFIED: new facts, theories, models
- EMBODIED: students
and knowledge applications/technologies:
- Policies, legislation, practices
- Process technologies
- Product technologies
- Tests, scenarios, systems

UPTAKE: by the scientific community, society, industry and government

IMPACT
24. But there are also fundamental differences between intervention and research programmes!

- An intervention programme is a structured set of goal-driven activities aimed at producing certain (measurable) positive outcomes/effects for a specific target group
- A research programme is an assemblage of loosely interrelated, sometimes converging but also diverging activities (research lines) intended to produce credible results that have epistemic (knowledge-generating) and non-epistemic (symbolic/social/technological/economic) value
- Proper implementation of an intervention programme (a training programme or poverty alleviation programme) implies that there are expected outcomes that can be predicted, i.e. that are reasonably determinate
- Good execution of a research programme still does not imply that the outcomes are determinate: research essentially remains an open-ended process of unforeseen discoveries and findings
25. Concluding observations

- The impact of intervention AND research programmes is usually:
  - Only evident after some time has elapsed (the notion of emergence)
  - The combined result of various effects or outcomes that together produce the benefits to the users (the notion of accumulation)
  - Made up of very different kinds of mutually reinforcing effects (the notion of differentiation)
- BUT whereas intervention programmes have both intended AND unintended effects (the notion of goal-driven vs. goal-free evaluation)
- Research programmes are more likely to include unforeseen than foreseen effects (the notion of indeterminacy)
26. Conclusions

- Measuring the impact of scientific research (programmes) means observing/estimating the accumulated, differentiated, proximate and emergent effects, some of which will of necessity be unforeseen
27. Thank you