Transcript and Presenter's Notes

Title: Evaluation


1
Evaluation & Dissemination
  • Martin Oliver & Grainne Conole

2
An overview of the afternoon
  • A recap of EFFECTS & of the topics for today
  • An introduction to evaluation and dissemination
  • An emphasis on using these as strategic and
    political tools
  • Asking you to think about your experiences and
    understanding, and using these to generate ideas,
    principles, etc. (A file with this information
    will be circulated afterwards.)
  • Hand out some papers & these slides at the end

3
Setting the scene
  • In 1997, professional development of academics
    was put on the national agenda
  • A group of people got together to think about how
    to develop academics to use new technologies
  • 1998: EFFECTS (Effective Frameworks for
    Embedding C&IT with Targeted Support) project
    starts with TLTP funding

4
Setting the scene
  • Committed to a number of values, including
  • Local implementation (not one size fits all)
  • Scholarship (don't just do it, think about it)
  • Consultation and discussion (this isn't just
    about us)
  • Reflected in the project outcomes
  • Strong commitment to evaluation
  • Strong commitment to research
  • Track record of dissemination
  • Also incorporated into learning outcomes

5
Setting the scene
  • We wanted staff who were informed, not just
    trained
  • We wanted to help people develop (not do things
    to them)
  • A slogan emerged from one project meeting (and
    made it onto the cover of the final evaluation
    report)
  • EFFECTS: not so much a framework as an agenda
    for change
  • This sense of political action was important - a
    theme for today's seminar

6
Setting the scene
  • Two EFFECTS outcomes for this afternoon
  • Outcome 5: Evaluated impact of the interventions
  • This will include evidence that you have
  • Evaluated the impact of the incorporation of
    technology on students and colleagues. Maintained
    an awareness of external changes and made
    adaptations as necessary.
  • Outcome 6: Disseminated the findings of the
    evaluation
  • This will include evidence that you have
  • Provided feedback for students and colleagues and
    disseminated experience and findings to
    department or more widely.

7
Before we give you information
  • We want to start by getting you thinking about
    evaluation and dissemination
  • First, think about different times when you've
    been involved in evaluation (either doing it or
    having it done to you!)
  • Then, write down:
  • A brief description of the best experience you've
    had with evaluation
  • A brief description of the worst experience
    you've had with evaluation
  • We'll spend about five minutes on this

8
But what is evaluation?
  • It's not one thing, so there's no single
    definition
  • However, a useful starting point is: the process
    of making judgements about the worth (costs and
    values) of something
  • Also used to describe
  • Descriptive studies
  • Intervention studies (e.g. formative evaluation)
  • Empirical research
  • Monitoring
  • Quality Assurance processes

9
But what is evaluation?
  • What do these things have in common?
  • Evaluators. Evaluation is what evaluators do.
    (A community of practice kind of definition)
  • Empiricism. Evaluation involves judgements about
    data.
  • For the most part, judgement (the 'value' in
    'evaluate')

10
But what is evaluation?
  • And what do these activities look like?
  • Data collection of various types
  • Interviews, surveys, finance spreadsheets, focus
    group transcripts, documents, tallies of
    promotions, emails
  • No inherent reason why it has to be done a
    particular way
  • Evaluation doesn't have to involve interviews, or
    randomised controlled trials, or ...
  • Choice of method depends on the person, the
    situation, the audience and so on

11
Utilization-focused evaluation
  • Evaluation in EFFECTS followed a particular
    tradition: utilization-focused evaluation
    (Patton, 1997)
  • A philosophy that arose from
  • The realisation that no-one read evaluation
    reports
  • The feeling that the qualitative/quantitative
    paradigm war was never going to be solved
  • Argues that evaluations should be judged against
    how well they help people to do things
  • and not on whether they are good technically
    (e.g. valid), of a particular type (e.g.
    experimental) or by particular types of people
    (e.g. external evaluators)

12
Utilization-focused evaluation
  • Communication is key to this kind of evaluation
  • No matter how good the study, it's useless if
    the people who need the information don't get it
    in time
  • Closely related to ideas of democratic and
    emancipatory evaluation
  • We need to understand how other people understand
    this situation
  • We need to get groups (e.g. policy makers,
    academics) talking to each other
  • Evaluation can provide a framework in which this
    can happen

13
Now back to you!
  • Spend a short while thinking
  • Has this made you think of any other experiences
    of evaluation?
  • Do these ideas help you to make sense of your
    experience of evaluation at all?
  • And spend a short while talking
  • With a convenient group of people, compare your
    experiences
  • Why were these good or bad? (What feature or
    quality made it a good or bad experience?)
  • We'll gather some suggestions from you after a
    few minutes

14
But we want more
  • Think about any involvement you've had with
    project dissemination
  • As before, write down:
  • A description of the best experience you've had
  • A description of the worst experience you've had
  • Take a few minutes to do this on your own

15
And what's dissemination?
  • Sounds like a stupid question
  • But if we've rejected the idea that we can
    transmit learning to students, why do we persist
    in thinking we can just transmit findings to
    peers?
  • This is not about volume (how many people, how
    many papers, how loud you talk); it's about making
    meaning

16
And what's dissemination?
  • So how can we make sure our messages are
    meaningful to people?
  • In EFFECTS, we tried
  • Giving out drafts to see what people thought,
    e.g. at workshop sessions at ALT-C, so other
    people's voices were represented too
  • Running workshops so that the ideas could be
    discussed, not just presented
  • Working with people (partner sites) so that
    experiences could be shared and jointly
    interpreted
  • Building relationships with people so that they
    learnt how to interpret the kinds of things we
    offered them

17
And what's dissemination?
  • We also need to think about what counts as
    dissemination
  • Publishing a journal paper?
  • Giving a workshop?
  • Producing a leaflet?
  • A project team meeting?
  • Chatting to a colleague over coffee?
  • Moaning to your partner about work?
  • Formal project evaluation favours 'obvious'
    forms, but studies of change in organisations
    suggest that 'invisible' forms might be more
    effective
  • Whether they're more powerful or not, they're
    different

18
And what's dissemination?
  • A rhetorical question: what method of
    dissemination would you find most meaningful?
  • A paper about evaluating EFFECTS?
  • A presentation about evaluating EFFECTS?
  • Chatting with people who evaluated EFFECTS about
    their experience?
  • Being asked to take ideas from EFFECTS, relate
    them to what you do and compare this with others?
  • We're offering all of these because different
    people might respond differently to each

19
Back to you - again
  • Spend a short while thinking
  • Has this made you think of any other experiences
    of dissemination?
  • Do these ideas help you to make sense of your
    experience of dissemination at all?
  • And spend a short while talking
  • With a convenient group of people, compare your
    experiences
  • Why were these good or bad? (What feature or
    quality made it a good or bad experience?)
  • We'll gather some suggestions from you after a
    few minutes

20
Drawing this first part together
  • Developing some principles
  • We've collected qualities of good and bad
    evaluation, and of good and bad dissemination
  • How can we use this understanding to guide what
    we do?
  • You volunteer some principles for evaluation and
    then for dissemination, and we'll record them

21
A tool for planning
  • The Evaluation Toolkit: an online, step-by-step
    guide to help people plan evaluation &
    dissemination activities
  • Provides layered guidance on the steps, process,
    associated resources & issues for each stage
  • Consists of three stages: planning, advising and
    presenting
  • Available online at
    http://www.ltss.bris.ac.uk/jcalt/

22
Evaluation Planner
  • Five steps
  • What are you evaluating?
  • Reasons
  • Context
  • Who is it for?
  • Devising the question

23
Working through this
  • What are you evaluating?
  • Different kinds of evaluation, e.g. of a web
    site, a project, a strategy, a teaching
    innovation
  • Reasons - why are you evaluating this?
  • Validation, monitoring, research, justification,
    improving, selecting, to provide evidence
  • Context
  • Scope and constraints of your evaluation
  • Who is it for?
  • Identifying key stakeholders, their needs and
    interests
  • Students, managers, funders, colleagues

24
Working through this
  • Devising the question
  • Using the previous steps, brainstorm different
    ways of formulating questions
  • Try to devise a range of question types (e.g.
    comparisons, contrasts, explorations, quantities,
    negatives)
  • Keep it simple
  • Focus on key stakeholders and key questions
  • Easy to get out of hand - no more than 3
    stakeholders recommended!

25
Evaluation Adviser
  • Two (big) steps
  • Data capture
  • Choosing the methods to use
  • Describing how you're going to use this in
    practice (when, what with, under what
    constraints)
  • Data analysis
  • Choosing the methods to use
  • Describing how you're going to use this in
    practice (when, what with, under what constraints)

26
Working through this
  • Data capture methods
  • Mapping your evaluation questions to appropriate
    methods
  • Take account of your own level of expertise and
    available time
  • Be aware of what each approach was designed to do
    (don't use stats on a group of four people or try
    to interview 200)
  • Use a variety of methods to build a coherent
    picture (triangulation)

27
Working through this
  • Five common methods
  • Focus groups
  • A quick way of getting a range of views/ideas,
    good for exploration
  • Not necessarily representative; can go off-topic,
    and individuals can dominate them
  • Interviews
  • A way to understand people's experiences of
    things; provides an in-depth picture of individual
    views
  • Time consuming

28
Working through this
  • Surveys
  • Good broad overview of issues
  • Can be time consuming to analyse; need to be
    careful when devising questions
  • Usage logs
  • Readily available
  • Need to be careful when interpreting what these
    mean
  • Experiments
  • Good to compare two things
  • Difficult to do controlled studies in educational
    settings; can raise ethical issues

29
Working through this
  • Data analysis
  • Need to map methods to types of data
  • Be aware of your expertise and time
  • As before, be aware of what each approach can and
    can't do

30
Working through this
  • Four common examples
  • Grounded theory
  • Doesn't pre-suppose particular outcomes
  • Takes ages to do well, requires iterative data
    collection
  • Statistical analysis
  • Can use standard methods to analyse things
    quickly
  • You need to know what you're doing
  • Narrative case study
  • Gives a rich, contextual picture
  • Isn't generalisable
  • Pre-determined list of categories
  • Builds on previous research
  • May not map to this particular situation

31
Evaluation Presenter
  • Two steps
  • Closing the loop (reflecting on the process)
  • Presentation tools
  • Selecting the tools to use
  • Describing how you're going to use these in
    practice

32
Working through this
  • Six common presentation tools
  • Journal article
  • Academic credibility
  • Long lead-time, might only reach a narrow group
    of people
  • Newsletters
  • Quick
  • Disposable
  • Email lists
  • Quicker; targeted to particular groups
  • May not be read

33
Working through this
  • Committee reports
  • Specific stakeholders; can be used for political
    mileage
  • Counter-politics
  • Verbal presentation
  • Quick and easy to do, targeted to particular
    audiences (responsive)
  • Transient
  • Workshops
  • Allows you to work through issues in detail
  • Time consuming, reaching only small groups

34
Planning a study
  • Organise yourselves into small groups
  • Decide whose study to focus on
  • Look through the summary plan to see how it's
    been described
  • Work through the steps of the toolkit, making
    notes about how your own study might look (about
    15 minutes)
  • Choose whether to try and work through the whole
    plan or whether to spend most of the time
    discussing particular sections
  • (You'll need to come back to this at the end of
    the session)
  • Two or three groups to volunteer to describe
    interesting features of their plans

35
Thinking strategically about evaluation
  • Think through individually
  • Key barriers and enablers: individual,
    departmental, institutional, and external
  • Who are the key people and committees to target?
    Think of key people (internal and external),
    committees, etc., and where power lies
  • Share experiences in pairs and with group

36
Thinking strategically about evaluation
  • Example drivers
  • Quality audit, institutional audit, learning &
    teaching strategies, operational plans, new
    appointments of key people, external drivers
    (e.g. funding)
  • Current examples:
  • The new academy (with Liz Beaty in place)
  • Learning and teaching strategies
  • Using research initiatives as a Trojan horse
  • Beware: different strategies might work at
    different times; also, some will work in some
    institutions and not others!

37
Thinking strategically about dissemination
  • Think about different ways of disseminating
  • What formats to use
  • Who to target
  • Now think about when it would be most effective
    to do dissemination
  • Draw up relevant lifecycles (e.g. academic &
    other internal lifecycles, relevant external
    events)
  • Consider how these can be targeted and used
  • Make a personal list for the study you've got in
    mind
  • Share your timetable in pairs, then with the group

38
Thinking strategically about dissemination
  • Some things to consider
  • These might be the same stakeholders as for
    evaluation, but they might not be!
  • Critical times of the year - when to and when not
    to disseminate
  • Start of the year, exams, as part of other
    development activities/events
  • Think ahead of time, and work in things such as
    using external speakers
  • Not just about presenting - e.g. input into
    strategic plans, operational plans, etc.
  • Indirect dissemination through others can be very
    effective!

39
Thinking personally about evaluation
  • Leaving aside what evaluation can do for your
    project
  • what can evaluation do for you?
  • At a personal level
  • What do you hope to learn?
  • What might you gain?
  • Who do you hope to persuade?
  • What problems might you cause?
  • Spend five minutes writing down a list - this is
    one you don't have to share!

40
Thinking personally about evaluation
  • Evaluation is (should be) a learning experience
  • It's a chance for you to make connections with
    people
  • It's a chance to associate with (or criticise
    constructively!) a project
  • It can be a chance to build goodwill by giving
    good advice or helping solve problems for the
    project team
  • Reporting findings gets your name in front of
    funders/policy people/managers/committees
  • You might be able to publish something based on
    the study

41
Thinking personally about evaluation
  • These aren't things that are often talked about
  • If you're an evaluator, you have power and
    opportunity - so be honest about it!
  • The potential problem: being professional, and
    conflicts of interest
  • Would any of these personal aspirations prevent
    you from doing your role well?
  • Would any affect timeliness, usefulness, how
    informative the evaluation was, etc.?
  • Which can you pursue, and which might you have to
    give up in order to do the best job for your
    clients?

42
Thinking personally about evaluation
  • So, back to your personal lists
  • Think creatively about what your evaluation might
    enable you to do. Are there other people this
    could help you meet or influence, for example?
  • Think about the tensions between your personal
    aspirations and what you might call your
    professional duty - where might conflicts arise?
  • Revise your list of personal aspirations in light
    of these exercises
  • Are there any examples of things that haven't
    been mentioned that you'd be willing to share?
    (Call them out!)

43
Being timely
  • You've established aims for your project and for
    yourself
  • You've thought about how you're going to gather
    the data
  • You've thought about who wants to know what
  • So when's all this going to happen?

44
Being timely
  • Evaluation is time consuming
  • If you're lucky, you'll have a useful plan by the
    end of today. Some plans take days of discussion
    time, particularly those that are politically
    sensitive.
  • Gathering and analysing data takes longer than
    you think (e.g. 1 hr of interview taking 4 hrs of
    transcription before analysis starts)
  • Writing can be time consuming, especially in teams

45
Being timely
  • It's not just the quantity of time, though
  • Do you need data from students?
  • When will they be able and willing to provide it?
  • Do they disappear just before exams, never to
    return?
  • Do you need data from staff?
  • Are they busy all term and/or absent all summer?
  • When do you have free time to analyse all this?
  • Can you set aside time as part of your job?
  • Do you need to make time by giving up other
    things?
  • When does it have to be done by?
  • Which committee will you report to, and when does
    it meet?

46
Being timely
  • Are there opportunities or problems on the
    horizon that might influence what you do?
  • People are often more interested in evaluations
    just before a quality audit
  • Documents might be useful in gaining points for
    your department if you can provide evidence of
    furthering strategic priorities or fitting with
    the learning & teaching strategy
  • You might find a controversial course proposal is
    helped if you can append an evaluation of
    potential students' needs - so can you evaluate
    these in time?
  • Evidence of success might help in terms of
    gaining access to funding (internally or
    externally)

47
Being timely
  • Think about your study in terms of time - for
    example:
  • Is the volume of work you've planned realistic?
  • Is the timing of work you've planned practical?
  • Are there going to be any enforced delays?
  • Are there any important or immovable deadlines?
  • Do you need to catch particular people before
    they get busy/go on leave/leave, or want to wait
    until someone new is in post?
  • Use this to sketch out a plan for your study

48
How do I know impact when I see it?
  • Lots of talk about the process of judging, but so
    far not much talk about judgement itself
  • What counts as evidence of impact? (What do we
    mean by impact anyhow?) And when you come to
    it, what exactly is e-learning, or staff
    development, or teaching?
  • We can't make judgements without making
    assumptions, so let's be honest about what we're
    assuming

49
I was proceeding across campus in an orderly
fashion when
  • Another task for you to do
  • Imagine you're a detective, and you have been
    dispatched to an institution where the awful
    crime of 'staff development' appears to have
    taken place
  • Your task is to make a case to prove that a
    particular person or project is responsible for
    doing this to staff
  • What evidence would you look for? How would you
    use this to argue guilt?
  • Spend five minutes on your own planning your
    investigation and case

50
I was proceeding across campus in an orderly
fashion when
  • Now get together in convenient groups
  • Spend a few minutes
  • Each of you present your case
  • The job of the listeners is to point out
    weaknesses in the case. (Have you checked their
    alibi? What if they were covering up for someone
    else? Do they really know what they're saying?)

51
The problem with evidence
  • It's not always easy to be convincing, e.g.
  • Documenting that things have happened doesn't
    tell you why they happened
  • Documenting people's reasons only gives you their
    (partial) perspective on a situation
  • Measuring things (e.g. exam performance) tells
    you nothing about things you might not have
    measured (e.g. learning). (An aside: think about
    the rhetoric of models - if they tell you that
    the world works a certain way, they stop you from
    looking at things that don't work that way.)

52
The problem with evidence
  • The importance of triangulation
  • Any kind of evidence (interviews, surveys, etc.)
    only gives you part of the story
  • Any source of evidence (learners, academics,
    managers, etc.) only gives you part of the story
  • Comparing and synthesising across partial
    accounts gives you a fuller (but never full!)
    story
  • The importance of modest claims
  • 'Staff perceived that the workshops changed their
    lives'
  • 'This study demonstrated that re-training staff
    improved retention. However, it may be that our
    model is too simplistic, and factors such as the
    cost of education also had a role to play, even
    though we could not consider this here.'

53
Judging things
  • We'd like to hear
  • Some examples of convincing cases (and why you
    thought they were convincing)
  • Some examples of unconvincing cases (and why you
    thought they were unconvincing)

54
Drawing it all together
  • By way of a recap, we've covered
  • Definitions
  • Principles
  • Strategy (people and politics)
  • Personal politics
  • Timeliness
  • Judgement
  • Are there any other topics you want to raise for
    discussion at this point?

55
Drawing it all together
  • Time to make use of all this
  • Go back to the study plans you worked on
  • Discuss how you would change this in light of
    this afternoon's work

56
So, what have you learnt?
  • We'd like you to pause, then share in groups
  • Any major changes (in terms of focus and
    approach) in the study you were thinking about
  • Anything you've learnt that you didn't expect
  • Any revelations you've had about your personal
    situation, and how to develop it
  • We'll then ask groups to share some examples with
    everyone

57
The outputs from today
  • What do we think you should have got from this?
  • Formal stuff: papers, overheads
  • Generated stuff: experiences of evaluation,
    experiences of dissemination, principles for
    these, a list of personal aspirations
  • Personal stuff: the plans (for projects and for
    personal aspirations) that you've produced
  • Intangible stuff: the contacts over coffee or
    from group work, the discussions that you've had,
    the concepts you've acquired and will take away

58
The outputs from today
  • What evidence do we have (at least in theory!)
    that you've learnt something from this?
  • The things you just told us you'd learnt!
  • Revisions of your lists - evidence you've changed
    your understanding and beliefs
  • Production of outputs - our co-construction of
    understanding (e.g. of principles of good
    evaluation)
  • So could we claim that this session has developed
    staff?
  • and on that note, we'll call a halt!