Transcript and Presenter's Notes

Title: Consider the Evidence


1
Consider the Evidence
  • Evidence-driven decision making for secondary
    schools
  • A resource to assist schools to review their use
    of data and other evidence

2
Evidence-driven decision making
  • Today we aim to
  • think about how we use data and other evidence to
    improve teaching, learning and student
    achievement
  • improve our understanding, confidence and
    capability in using data to improve practice
  • discuss how we make decisions
  • think about our needs and start to plan our own
    evidence-based projects

3
Evidence-driven eating
  • You need to buy lunch. Before you decide what to
    buy you consider a number of factors
  • how much money do you have?
  • what do you feel like eating?
  • what will you be having for dinner?
  • how far do you need to go to buy food?
  • how much time do you have?
  • where are you going to eat it?

4
Evidence-driven teaching
  • I had a hunch that Ana wasn't doing as well as
    she could in her research assignments, a major
    part of the history course. What made me think
    this?
  • Ana's general work (especially her writing) was
    fine. She made perceptive comments in class,
    contributed well in groups and had good results
    overall last year, especially in English.
  • How did I decide what to do about it?
  • I looked more closely at her other work. I
    watched her working in the library one day to see
    if it was her reading, her use of resources, her
    note taking, her planning, or what. At morning
    tea I asked one of Ana's other teachers about
    Ana's approach to similar tasks. I asked Ana if
    she knew why her research results weren't as good
    as her other results, and what her plans were for
    the next assignment.
  • I thought about all of this and planned a course
    of action. I gave her help with using indexes,
    searching, note taking and planning and linking
    the various stages of her research.

5
Consider the Evidence
  • A resource to assist schools
  • to review their use of data and other evidence
  • What is meant by data and other evidence?

6
Evidence
  • Any facts, circumstances or perceptions that can
    be used as an input for an analysis or decision
  • how classes are compiled, how classes are
    allocated to teachers, test results, teachers'
    observations, attendance data, portfolios of
    work, student opinions
  • Data are one form of evidence

7
Data
  • Known facts or measurements, probably expressed
    in some systematic or symbolic way (e.g. as
    numbers)
  • assessment results, gender, attendance, ethnicity
  • Data are one form of evidence

8
Which factors are data?
  • Evidence to consider before buying lunch
  • how much money you have
  • what you feel like eating
  • what you'll be having for dinner
  • how far you need to go to buy food
  • how much time you have
  • where you're going to eat
  • what your diet allows

9
Evidence-driven decision making
  • We have more evidence about what students know
    and can do than ever before: their achievements,
    behaviours, environmental factors that influence
    learning
  • We should
  • draw on all our knowledge about the learning
    environment to improve student achievement
  • explore what lies behind patterns of achievement
  • decide what changes will make a difference

10
What evidence does a school have?
  • Demographics
  • Student achievement
  • Perceptions
  • School processes
  • Other practice

11
Demographics
  • What data do we have now to provide a profile of
    our school?
  • What other data could we create?
  • School
  • Students
  • Staff
  • Parents/caregivers and community

12
Demographics
  • Data that provide a profile of our school
  • School - decile, roll size, urban/rural, single
    sex or co-educational, teaching spaces
  • Students - ethnicity, gender, age, year level,
    attendance, lateness, suspension and other
    disciplinary data, previous school, part-time
    employment
  • Staff - gender, age, years of experience,
    qualifications, teaching areas, involvement in
    national curriculum and assessment, turnover rate
  • Parents/caregivers and community -
    socio-economic factors, breadth of school
    catchment, occupations

13
Student achievement
  • What evidence do we have now about student
    achievement?
  • What other evidence could we collect?
  • National assessment results
  • Standardised assessment results administered
    internally
  • Other in-school assessments
  • Student work

14
Student achievement
  • Evidence about student achievement
  • National assessment results - NCEA, NZ
    Scholarship - details like credits above and
    below year levels, breadth of subjects entered
  • Standardised assessment results administered
    internally - PAT, asTTle
  • Other in-school assessments - most
    non-standardised but some, especially within
    departments, will be consistent across classes -
    includes data from previous schools,
    primary/intermediate
  • Student work - work completion rates, internal
    assessment completion patterns, exercise books,
    notes, drafts of material - these can provide
    useful supplementary evidence

15
Perceptions
  • What evidence do we have now about what
    students, staff and others think about the
    school?
  • Are there other potential sources?
  • Self appraisal
  • Formal and informal observations made by teachers
  • Structured interactions
  • Externally generated reports
  • Student voice
  • Other informal sources

16
Perceptions
  • Evidence about what students, staff, parents and
    the community think about the school
  • Self appraisal - student perceptions of their own
    abilities, potential, achievements, attitudes
  • Formal and informal observations made by teachers
    - peer interactions, behaviour, attitudes,
    engagement, student-teacher relationships,
    learning styles, classroom dynamics
  • Structured interactions - records from student
    interviews, parent interviews, staff conferences
    on students
  • Externally generated reports - from ERO and NZQA
    (these contain data but also perceptions)
  • Student voice - student surveys, student council
    submissions
  • Other informal sources - views about the school
    environment, staff and student morale, board
    perceptions, conversations among teachers

17
School processes
  • What evidence do we have about how our school is
    organised and operates?
  • Timetable
  • Classes
  • Resources
  • Finance
  • Staffing

18
School processes
  • Evidence about how our school is organised and
    operates
  • School processes - evidence and data about how
    your school is organised and operates, including
  • Timetable - structure, period length, placement of
    breaks, subjects offered, student choices,
    tertiary and workforce factors, etc
  • Classes - how they are compiled, their
    characteristics, effect of timetable choices, etc
  • Resources - access to libraries, text books, ICT,
    special equipment, etc
  • Finance - how the school budget is allocated, how
    funds are used within departments, expenditure on
    professional development
  • Staffing - policies and procedures for employing
    staff, allocating responsibility, special roles,
    workload, subjects and classes

19
Other practice
  • How can we find out about what has worked (or
    not) in other schools?

20
Other practice
  • How can we find out about what has worked in
    other schools?
  • Documented research - university and other
    publications, Ministry of Education's Best
    Evidence Syntheses, NZCER, NZARE, overseas
    equivalents
  • Experiences of other schools - informal contacts,
    local clusters, advisory services, TKI LeadSpace

21
What can we do with evidence?
  • Shane's story
  • A history HOD wants to see whether history
    students are performing to their potential.
  • She prints the latest internally assessed NCEA
    records for history students across all of their
    subjects. As a group, history students seem to be
    doing as well in history as they are in other
    subjects.
  • Then she notices that Shane is doing very well in
    English and only reasonably well in history. She
    wonders why, especially as both are language-rich
    subjects with many similarities.
  • The HOD speaks with the history teacher, who says
    Shane is attentive, catches on quickly and
    usually does all work required. He mentions that
    Shane is regularly late for class, especially on
    Monday and Thursday. So he often misses important
    information or takes time to settle in. He has
    heard there are problems at home so has
    overlooked it, especially as the student is doing
    reasonably well in history. (cont'd)

22
Shane's story (cont'd)
  • The HOD looks at the timetable and discovers that
    history is Period 1 on Monday and Thursday. She
    speaks to Shane's form teacher who says that she
    suspects Shane is actually late to school
    virtually every day. They look at centralised
    records. Shane has excellent attendance but
    frequent lateness to period 1 classes.
  • The HOD speaks to the dean who explains that
    Shane has to take his younger sister to school
    each morning. He had raised the issue with Shane
    but he said this was helping the household get
    over a difficult period and claimed he could
    handle it.
  • The staff involved agree that Shane's regular
    lateness is having a demonstrable impact on his
    achievement, probably beyond history but not so
    obviously.
  • The dean undertakes to speak to the student,
    history teacher, and possibly the parents to find
    a remedy for the situation.

23
Thinking about Shane's story
  • What were the key factors in the scenario about
    Shane?
  • What types of data and other evidence were used?
  • What questions did the HOD ask?
  • What happened in this case that wouldn't
    necessarily happen in some schools?

24
Shane's story - keys to success
  • The history HOD looked at achievement data in
    English and history.
  • She looked for something significant across the
    two data sets, not just low achievement.
  • Then she asked a simple question: why is there
    such a disparity between these two subjects for
    that student?
  • She sought information and comments (perceptions
    evidence and data) from all relevant staff.
  • The school had centralised attendance and
    punctuality records (demographic data) that the
    form teacher could access easily.
  • The action was based on all available evidence
    and designed to achieve a clear aim.

25
Evidence-driven strategic planning
  • If we use evidence-driven decision making to
    improve student achievement and enhance teaching
    practice
  • it follows that strategic planning across the
    school should also be evidence-driven.

26
Evidence-driven strategic planning
  • (Diagram slide; image not included in the transcript.)

27
The evidence-driven decision making cycle
  • Trigger
  • Explore
  • Question
  • Assemble
  • Analyse
  • Interpret
  • Intervene
  • Evaluate
  • Reflect

28
The evidence-driven decision making cycle
  • Trigger - Clues found in data, hunches
  • Explore - Is there really an issue?
  • Question - What do you want to know?
  • Assemble - Get all useful evidence together
  • Analyse - Process data and other evidence
  • Interpret - What information do you have?
  • Intervene - Design and carry out action
  • Evaluate - What was the impact?
  • Reflect - What will we change?

29-31
The evidence-driven decision making cycle
  • (Slides 29-31 show the cycle diagram built up in
    stages; images not included in the transcript.)

32
Evaluate and reflect
  • Summative evaluation - assess how successful the
    intervention was; decide how our practice will
    change; report to the board
  • Formative evaluation - at every stage in the
    cycle we reflect and evaluate:
  • Are we on the right track?
  • Do we need to fine-tune?
  • Do we actually need to complete this?

33
Types of analysis
  • We can compare achievement data by subject or
    across subjects for
  • an individual student
  • groups of students
  • whole cohorts
  • The type of analysis we use depends on the
    question we want to answer

34
Inter-subject analysis
  • Have my students not achieved a particular
    history standard because they have poor formal
    writing skills, rather than poor history
    knowledge?

35
Intra-subject analysis
  • What are the areas of strength and weakness in my
    own teaching of this class?

36
Longitudinal analysis
  • Are we producing better results over time in year
    11 biology?

37
The evidence-driven decision making cycle
  • → Trigger - Clues found in data, hunches
  • Explore - Is there really an issue?
  • Question - What do you want to know?
  • Assemble - Get all useful evidence together
  • Analyse - Process data and other evidence
  • Interpret - What information do you have?
  • Intervene - Design and carry out action
  • Evaluate - What was the impact?
  • Reflect - What will we change?

38
Asking questions
  • Evidence-driven decision making starts with
    asking good questions
  • "You can tell whether a man is clever by his
    answers. You can tell whether he is wise by his
    questions."
  • Nobel Prize winner Naguib Mahfouz

39
Trigger questions
  • How good/poor is ...?
  • What aspects of ... are good/poor?
  • Is ... actually changing?
  • How is ... changing?
  • Is ... better than last year?
  • How can ... be improved?
  • Why is ... good/poor?
  • What targets are reasonable for ...?
  • What factors influence the situation for ...?
  • What would happen if we ...?
  • Formative or summative?

40
Summative questions
  • A target in the school's annual plan is for all
    year 10 boys to improve their writing level by at
    least one level using asTTle (e.g. from 4B to
    4A).
  • Have all year 10 boys improved by at least one
    asTTle level in writing?

41
Questions about policy
  • We have been running 60-minute periods for 5
    years now.
  • What effect has the change had?

42
Formative questions from data
  • The data suggest our students are achieving well
    in A, but less well in B.
  • What can we do about that?

43
Formative questions from data
  • A significant proportion of our school leavers
    enrol in vocational programmes at polytechnic or
    on-job.
  • How well do our school programmes prepare those
    students?

44
Questions from hunches
  • I suspect this poor performance is being caused
    by ...
  • Is this true?
  • We reckon results will improve if we put more
    effort into ...
  • Is this likely?
  • I think we'd get better results from this module
    if we added ...
  • Is there any evidence to support this idea?

45
Hunches from raw data
  • (Slide shows a table of raw class data; image not included in the transcript.)

46
Hunches from raw data
  • Is the class as a whole doing better in
    internally assessed standards than in externally
    assessed standards? If so, why?
  • Are the better students (with many Excellence
    results) not doing as well in external
    assessments as in internal? If so, why?
  • Is there any relationship between absences and
    achievement levels? It seems not, but it's worth
    analysing the data to be sure (see the sketch
    below).
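
A minimal Python sketch of that last check, comparing achievement for students with many and few absences. All the records below are invented for illustration:

    # Test the hunch that absences relate to achievement.
    # Each record is (absences this term, NCEA credits gained) -- invented.
    from statistics import mean

    records = [(2, 24), (14, 18), (0, 26), (9, 22), (21, 12), (5, 23), (11, 20)]

    # Split the class at the median absence count and compare mean credits.
    absences = sorted(a for a, _ in records)
    cut = absences[len(absences) // 2]

    low = [credits for a, credits in records if a < cut]
    high = [credits for a, credits in records if a >= cut]

    print(f"mean credits, fewer than {cut} absences: {mean(low):.1f}")
    print(f"mean credits, {cut} or more absences: {mean(high):.1f}")
    # Similar means suggest the hunch is not borne out, as the slide
    # anticipates; a clear gap would be worth analysing further.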

47
The evidence-driven decision making cycle
  • Trigger - Clues found in data, hunches
  • → Explore - Is there really an issue?
  • Question - What do you want to know?
  • Assemble - Get all useful evidence together
  • Analyse - Process data and other evidence
  • Interpret - What information do you have?
  • Intervene - Design and carry out action
  • Evaluate - What was the impact?
  • Reflect - What will we change?

48
Question → Explore → Question
  • It looks like our students are doing well in A
    but not in B. What can we do about it?
  • EXPLORE what else should we be asking?
  • Is this actually the case?
  • Is there anything in the data to suggest what we
    could do about it?

49
Question → Explore → Question
  • We have been running 60-minute periods for a
    year now. Did the change achieve the desired
    effects?
  • EXPLORE what else should we be asking?
  • How has the change impacted on student
    achievement?
  • Has the change had other effects?
  • Is there more truancy?
  • Is more time being spent in class on
    assignments, rather than as homework?

50
The evidence-driven decision making cycle
  • Trigger - Clues found in data, hunches
  • Explore - Is there really an issue?
  • → Question - What do you want to know?
  • Assemble - Get all useful evidence together
  • Analyse - Process data and other evidence
  • Interpret - What information do you have?
  • Intervene - Design and carry out action
  • Evaluate - What was the impact?
  • Reflect - What will we change?

51
A very good question
  • Specific and with a clear purpose
  • Able to be investigated through looking at data
    and other evidence
  • Likely to lead to information on which we can act

52
Questions with purpose
  • What do we know about reported bullying
    incidents for year 10 students?
  • MAY BE BETTER AS
  • Who has been bullying whom? Where?
  • What are students telling us?
  • What does pastoral care data tell us? Were some
    interventions more effective with some groups of
    students than others?

53
Write more purposeful questions
  • What are the attendance rates for year 11
    students?
  • What has been the effect of the new 6-day x
    50-min period structure?
  • How well are boys performing in formal writing in
    year 9?
  • What has been the effect of shifting the lunch
    break to after period 4?

54
More purposeful questions
  1. How do year 11 attendance rates compare with
    other year levels? Do any identifiable groups of
    year 11 students attend less regularly than
    average?
  2. Is the new 6-day x 50-min period structure having
    any positive effect on student engagement levels?
    Is it influencing attendance patterns? What do
    students say?
  3. Should we be concerned about boys' writing? If
    so, what action should we be taking to improve
    the writing of boys in terms of the literacy
    requirements for NCEA Level 1?
  4. The new timing of the lunch break was intended to
    improve student engagement levels after lunch.
    Did it achieve this? If so, did improvements in
    student engagement improve student achievement?
    Do the benefits outweigh any disadvantages?

55
The evidence-driven decision making cycle
  • Trigger - Clues found in data, hunches
  • Explore - Is there really an issue?
  • Question - What do you want to know?
  • → Assemble - Get all useful evidence together
  • Analyse - Process data and other evidence
  • Interpret - What information do you have?
  • Intervene - Design and carry out action
  • Evaluate - What was the impact?
  • Reflect - What will we change?

56
Assembling the evidence
  • We want to know if our senior students are doing
    better in one area of NCEA biology than another.
  • So we need NCEA results for our cohort.
  • It could be that all biology students do better
    in this area than others.
  • So we also need data about national
    differences across the two areas.

57
Are our data any good?
  • A school found that a set of asTTle scores
    indicated that almost all students were achieving
    at lower levels than earlier in the year.
  • Then they discovered that the first test had
    been conducted in the morning, but the later test
    was in the afternoon and soon after the students
    had sat a two-hour exam.

58
Think critically about data
  • Was the assessment that created this data
    assessing exactly what we are looking for?
  • Was the assessment set at an appropriate level
    for this group of students?
  • Was the assessment properly administered?
  • Are we comparing data for matched groups?

59
Cautionary tale 1
  • You want to look at changes in a cohort's asTTle
    writing levels over 12 months.
  • Was the assessment conducted at the same time
    both years?
  • Was it administered under the same conditions?
  • Has there been high turnover in the cohort?
  • If so, will it be valid to compare results?

60
Cautionary tale 2
  • You have data that show two classes have
    comparable mathematics ability. But end-of-year
    assessments show one class achieved far better
    than the other.
  • What could have caused this?
  • Were the original data flawed? How did teaching
    methods differ? Was the timetable a factor? Did
    you survey student views? Are the classes
    comparable in terms of attendance, etc?

61
The evidence-driven decision making cycle
  • Trigger - Clues found in data, hunches
  • Explore - Is there really an issue?
  • Question - What do you want to know?
  • Assemble - Get all useful evidence together
  • → Analyse - Process data and other evidence
  • Interpret - What information do you have?
  • Intervene - Design and carry out action
  • Evaluate - What was the impact?
  • Reflect - What will we change?

62
Analysing data and other evidence
  • Schools need some staff members who are
    responsible for leading data analysis
  • Schools have access to electronic tools to
    process data into graphs and tables
  • All teachers do data analysis
  • Data are not an end in themselves - they are one
    of the many stages along the way to
    evidence-driven decision making

63
Basic analysis
  • (Slide shows a class results table; image not included in the transcript.)

64
Basic analysis
  • Divide the class into three groups on the basis
    of overall achievement (sketched below)
  • Identify students who are doing so well at level
    2 that they could be working at a higher level
  • Find trends for males and females, those who are
    absent often, or have many detentions
  • Compare this group's external assessment success
    rate with the national cohort.
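
A minimal Python sketch of the first step, splitting a class into three bands by overall achievement. The names and scores are invented for illustration:

    # Rank students by overall score, then cut the ranking into thirds.
    students = {
        "Aroha": 78, "Ben": 55, "Caleb": 91, "Dana": 62, "Eru": 47,
        "Fiona": 84, "George": 70, "Hine": 59, "Ian": 66,
    }

    ranked = sorted(students, key=students.get, reverse=True)
    size = len(ranked) // 3              # a class of 9 splits evenly
    top, middle, bottom = (ranked[i:i + size]
                           for i in range(0, len(ranked), size))

    print("top band:   ", top)
    print("middle band:", middle)
    print("bottom band:", bottom)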

65
Reading levels terms 1 and 4
  • (Slide shows a table of reading levels for terms 1 and 4; image not included in the transcript.)

66
Making sense of the results
  • Think about significance and confidence
  • How significant are any apparent trends?
  • How much confidence can we have in the
    information?

67
Making sense of the results
  • This table shows that reading levels overall
    were higher in term 4 than in term 1.
  • Scores improved for most students.
  • 20% of students moved into level 5.
  • But the median score is still 4A.
  • Is this information? Can we act on it? (A sketch
    of this tally follows.)
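
The figures on this slide can be recovered from term-by-term levels with a short Python tally. A minimal sketch, using invented asTTle-style levels chosen so the totals echo the slide:

    from statistics import median_low

    SCALE = ["3A", "4B", "4A", "5B", "5A"]            # low -> high
    rank = {level: i for i, level in enumerate(SCALE)}

    # (term 1, term 4) reading level per student -- invented data
    scores = {
        "s1": ("4B", "4A"), "s2": ("4A", "4A"), "s3": ("3A", "4B"),
        "s4": ("4A", "5B"), "s5": ("4B", "4B"), "s6": ("4A", "5B"),
        "s7": ("4B", "4A"), "s8": ("3A", "4A"), "s9": ("4A", "4A"),
        "s10": ("4B", "4A"),
    }

    improved = sum(rank[t4] > rank[t1] for t1, t4 in scores.values())
    into_5 = sum(t4.startswith("5") for _, t4 in scores.values())
    median4 = median_low(rank[t4] for _, t4 in scores.values())

    print(f"{improved}/{len(scores)} students improved")
    print(f"{into_5} of {len(scores)} moved into level 5")
    print(f"term 4 median level: {SCALE[median4]}")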

68
Information
  • Knowledge gained from analysing data and making
    meaning from evidence.
  • Information is knowledge (or understanding) that
    can inform your decisions.
  • How certain you will be about this knowledge
    depends on a number of factors: where your data
    came from, how reliable it was, how rigorous your
    analysis was.
  • So the information you get from analysing data
    could be a conclusion, a trend, a possibility.

69
Information
  • Summative information is useful for reporting
    against targets and as general feedback to
    teachers.
  • Formative information is information we can act
    on - it informs decision-making that can improve
    learning.

70
Questions to elicit information
  • Did the more able students make significant
    progress, but not the lower quartile?
  • How have the scores of individual students
    changed?
  • How many remain on the same level?
  • How much have our teaching approaches contributed
    to this result?
  • How much of this shift in scores is due to
    students' predictable progress? Is there any data
    that will enable us to compare our students with
    a national cohort?
  • How does this shift compare with previous Year 9
    cohorts?

71
Reading levels terms 1 and 4
  • (Slide shows a table of reading levels for terms 1 and 4; image not included in the transcript.)

72
Words, words, words
  • Information can establish, indicate, confirm,
    reinforce, back up, stress, highlight, state,
    imply, suggest, hint at, cast doubt on, refute
  • Does this confirm that ...?
  • What does this suggest?
  • What are the implications of ...?
  • How confident are we about this conclusion?

73
The evidence-driven decision making cycle
  • Trigger - Clues found in data, hunches
  • Explore - Is there really an issue?
  • Question - What do you want to know?
  • Assemble - Get all useful evidence together
  • Analyse - Process data and other evidence
  • → Interpret - What information do we have?
  • Intervene - Design and carry out action
  • Evaluate - What was the impact?
  • Reflect - What will we change?

74
Making sense of information
  • "Data becomes information when it is categorised,
    analysed, summarised and placed in context.
    Information therefore is data endowed with
    relevance and purpose."
  • "Information is developed into knowledge when it
    is used to make comparisons, assess consequences,
    establish connections and engage in dialogue."
  • "Knowledge can be seen as information that
    comes laden with experience, judgment, intuition
    and values."
  • Empson (1999), cited in Mason (2003)

75
Interrogate the information
  • Is this the sort of result we envisaged? If not,
    why?
  • How does this information compare with the
    results of other research or the experiences of
    other schools?
  • Are there other variables that could account for
    this result?
  • Should we set this information alongside other
    data or evidence to give us richer information?
  • What new questions arise from this information?

76
Interrogate the information
  • Does this relate to student achievement - or does
    it actually tell us something about our teaching
    practices?
  • Does this information suggest that the school's
    strategic goals and targets are realistic and
    achievable? If not, how should they change, or
    should we change?
  • Does the information suggest we need to modify
    programmes or design different programmes?
  • Does the information suggest changes need to be
    made to school systems?

77
Interrogate the information
  • What effect is the new 6-day x 50-min period
    structure having on student engagement levels?

78
Interrogate the information
  • What effect is the new 6-day x 50-min period
    structure having on student engagement levels?
  • Do student views align with staff views?
  • Do positive effects outweigh negative effects?
  • Is there justification for reviewing the policy?
  • Does the information imply changes need to be
    made to teaching practices or techniques?
  • Does the information offer any hint about what
    sort of changes might work?

79
The evidence-driven decision making cycle
  • Trigger - Clues found in data, hunches
  • Explore - Is there really an issue?
  • Question - What do you want to know?
  • Assemble - Get all useful evidence together
  • Analyse - Process data and other evidence
  • Interpret - What information do you have?
  • → Intervene - Design and carry out action
  • Evaluate - What was the impact?
  • Reflect - What will we change?

80
Professionals making decisions
  • How do we decide what action to take as a result
    of the information we get from the analysis?
  • We use our professional judgment.

81
Professional decision making
  • We have evidence-based information that we see
    as reliable and valid
  • What do we do about it?
  • If the information indicates a need for action,
    we use our collective experience to make a
    professional decision

82
Professionals making decisions
  • Have my students not achieved a particular
    history standard because they have poor formal
    writing skills, rather than poor history
    knowledge?
  • The answer was "Yes" ... so I need to think
    about how to improve their writing skills. How
    will I do that?

83
Professionals making decisions
  • Do any particular groups of year 11 students
    attend less regularly than average for the whole
    cohort?
  • The analysis identified two groups - so I need
    to think about how to deal with irregular
    attendance for each group.
  • How will I do that?

84
Professionals making decisions
  • You asked what factors are related to poor
    student performance in formal writing.
  • The analysis suggested that poor homework habits
    have a significant impact on student writing.
  • You make some professional judgements and decide
  • Students who do little homework don't write
    enough
  • You could take action to improve homework habits
    - but you've tried that before and the success
    rate is low
  • You have more control over other factors like
    how much time you give students to write in class
  • So you conclude the real need is to get
    students to write more often

85
Deciding on an action
  • Information will often suggest a number of
    options for action. How do we decide which action
    to choose?
  • We need to consider
  • what control we have over the action
  • the likely impact of the action
  • the resources needed

86
Planning for action
  • Is this a major change to policy or processes?
  • What other changes are being proposed?
  • How soon can you make this change?
  • How will you achieve wide buy-in?
  • What time and resources will you need?
  • Who will co-ordinate and monitor implementation?

87
Planning for action
  • Is this an incremental change? Or are you just
    tweaking how you do things?
  • How will you fit the change into your regular
    work?
  • When can you start the intervention?
  • Will you need extra resources?
  • How will this change affect other things you do?
  • How will you monitor implementation?

88
Timing is all
  • How long should we run the intervention before we
    evaluate it?
  • When is the best time of the year to start (and
    finish) in terms of measuring changes in student
    achievement?
  • How much preparation time will we need to get
    maximum benefit?

89
Planning for evaluation
  • We are carrying out this action to see what
    impact it has on student achievement
  • We need to decide exactly how we'll know how
    successful the intervention has been
  • To do this we will need good baseline data

90
Planning for evaluation
  • What evidence do we need to collect before we
    start?
  • Do we need to collect evidence along the way, or
    just at the end?
  • How can we be sure that any assessment at the end
    of the process will be comparable with assessment
    at the outset?
  • How will we monitor any unintended effects?
  • Don't forget evidence such as timetables, student
    opinions, teacher observations

91
The evidence-driven decision making cycle
  • Trigger - Clues found in data, hunches
  • Explore - Is there really an issue?
  • Question - What do you want to know?
  • Assemble - Get all useful evidence together
  • Analyse - Process data and other evidence
  • Interpret - What information do you have?
  • Intervene - Design and carry out action
  • → Evaluate - What was the impact?
  • Reflect - What will we change?

92
Evaluate the impact of our action
  • Did the intervention improve the situation that
    triggered the process?
  • If the aim was to improve student achievement,
    did that happen?


93
Evaluate the impact of our action
  • Was any change in student achievement
    significant?
  • What else happened that we didn't expect?
  • How do our results compare with other similar
    studies we can find?
  • Does the result give us the confidence to make
    the change permanent?

94
Evaluate the impact of our action
  • A school created a new year 13 art programme. In
    the past students had been offered standard
    design and painting programmes, internally and
    externally assessed against the full range of
    achievement standards. Some students had to
    produce two folios for assessment and were unsure
    of where to take their art after leaving school.
  • The new programme blended drawing, design and
    painting concepts and focused on electronic
    media. Assessment was against internally assessed
    standards only.

95
Evaluate the impact of our action
  • Did students complete more assessments?
  • Did students gain more national assessment
    credits?
  • How did student perceptions of workload and
    satisfaction compare with teacher perceptions
    from the previous year?
  • Did students leave school with clearer intentions
    about where to go next with their art than the
    previous cohort?
  • How did teachers and parents feel about the
    change?

96
Evaluate the intervention
  • How well did we design and carry out the
    intervention? Would we do anything differently if
    we did it again?
  • Were our results affected by anything that
    happened during the intervention period - within
    or beyond our control?
  • Did we ask the right question in the first place?
    How useful was our question?
  • How adequate were our evaluation data?

97
Think about the process
  • Did we ask the right question in the first place?
    How useful was our question?
  • Did we select the right data? Could we have used
    other evidence?
  • Did the intervention work well? Could we have
    done anything differently?
  • Did we interpret the data-based information
    correctly?
  • How adequate were our evaluation data?
  • Did the outcome justify the effort we put into it?

98
The evidence-driven decision making cycle
  • Trigger - Clues found in data, hunches
  • Explore - Is there really an issue?
  • Question - What do you want to know?
  • Assemble - Get all useful evidence together
  • Analyse - Process data and other evidence
  • Interpret - What information do you have?
  • Intervene - Design and carry out action
  • Evaluate - What was the impact?
  • → Reflect - What will we change?

99
Future practice
  • What aspects of the intervention will we embed in
    future practice?
  • What aspects of the intervention will have the
    greatest impact?
  • What aspects of the intervention can we maintain
    over time?
  • What changes can we build into the way we do
    things in our school?
  • Would there be any side-effects?

100
Future directions
  • What professional learning is needed? Who would
    most benefit from it?
  • Do we have the expertise we need in-house or do
    we need external help?
  • What other resources do we need?
  • What disadvantages could there be?
  • When will we evaluate this change again?

101
Consider the Evidence
  • Terminology

102
Terminology
  • Terminology used in the evidence-driven decision
    making cycle
  • Trigger - Clues found in data, hunches
  • Explore - Is there really an issue?
  • Question - What do you want to know?
  • Assemble - Get all useful evidence together
  • Analyse - Process data and other evidence
  • Interpret - What information do you have?
  • Intervene - Design and carry out action
  • Evaluate - What was the impact?
  • Reflect - What will we change?

103
Trigger
  • Data, ideas, hunches, etc that set a process in
    action.
  • The trigger is whatever it is that makes you
    think there could be an opportunity to improve
    student achievement. You can routinely scan
    available data looking for inconsistencies, etc.
    It can be useful to speculate about possible
    causes or effects - and then explore data and
    other evidence to see if there are any grounds
    for the speculation.

104
Explore
  • Initial data, ideas or hunches usually need some
    preliminary exploration to pinpoint the issue and
    suggest good questions to ask.

105
Question
  • This is the key point: what question(s) do you
    want answered? Questions can raise an issue
    and/or propose a possible solution.

106
Assemble
  • Get together all the data and evidence you might
    need - some will already exist and some will have
    to be generated for the occasion.

107
Analyse
  • Process sets of data and relate them to other
    evidence.
  • You are looking for trends and results that will
    answer your questions (but watch out for
    unexpected results that might suggest a new
    question).

108
Interpret
  • Think about the results of the analysis and
    clarify the knowledge and insights you think you
    have gained.
  • Interrogate the information. It's important to
    look at the information critically. Was the data
    valid and reliable enough to lead you to firm
    conclusions? Do the results really mean what they
    seem to mean? How sure are you about the
    outcome? What aspects of the information lead to
    possible action?

109
Intervene
  • Design and implement a plan of action to change
    the situation you started with.
  • Be sure that your actions are manageable and
    look at the resourcing needed. Consider how
    you'll know what has been achieved.

110
Evaluate
  • Using measures you decided in advance, assess
    how successful the intervention has been.
  • Has the situation that triggered the process
    been improved? What else happened that you maybe
    didn't expect?

111
Reflect
  • Think about what has been learned and discovered
    and what practices you will change as a
    consequence.
  • What did we do that worked? Did this process
    suggest anything that we need to investigate
    further? What aspects of the intervention can be
    maintained? What support will we need?

112
Terminology
  • Other terms used in Consider the Evidence

113
Terminology
  • Analysis
  • A detailed examination of data and evidence
    intended to answer a question or reveal
    something.
  • This simplistic definition is intended to point
    out that data analysis is not just about
    crunching numbers - it's about looking at data
    and other evidence in a purposeful way, applying
    logic, creativity and critical thinking to see if
    you can find answers to your questions or reveal
    a need. For example, you can carry out a
    statistical analysis of national assessment
    results in the various strands of English across
    all classes at the same level. You could compare
    those results with attendance patterns. But you
    might also think about those results in relation
    to more subjective evidence - such as how each
    teacher rates his/her strengths in teaching the
    various strands.

114
Terminology
  • Aggregation
  • A number of measures made into one.
  • This is a common and important concept in
    dealing with data. A single score for a test that
    contains more than one question is an aggregation
    - two or more results have been added to get a
    single result. Aggregation is useful when you
    have too few data to create a robust measure or
    you want to gain an overview of a situation. But
    aggregation can blur distinctions that could be
    informative. So you will often want to
    disaggregate some data - to take the data apart
    to see what you can discover from the component
    parts. For example, a student may do moderately
    well across a whole subject, but you need to
    disaggregate the year's result to see where her
    weaknesses lie. (A minimal sketch follows.)
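
A minimal Python sketch of that example - one student's aggregated subject mark hiding a weak component. The topic marks are invented for illustration:

    # One student's year in mathematics, by topic -- invented marks.
    topic_marks = {
        "number": 82, "algebra": 75, "measurement": 78,
        "geometry": 64, "statistics": 49,
    }

    # Aggregation: many measures made into one.
    overall = sum(topic_marks.values()) / len(topic_marks)
    print(f"aggregated mark: {overall:.0f}")      # looks moderately fine

    # Disaggregation: take the result apart to find weak components.
    for topic, mark in sorted(topic_marks.items(), key=lambda kv: kv[1]):
        flag = "  <-- follow up" if mark < overall - 10 else ""
        print(f"{topic:12} {mark:3}{flag}")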

115
Terminology
  • Data
  • Known facts or measurements, probably expressed
    in some systematic or symbolic way (eg as
    numbers).
  • Data are codified evidence. (The word is used as
    a plural noun in this kit.) The concepts of
    validity and reliability apply to data. It helps
    to know where particular data came from - how data
    were collected and maybe processed before you
    received them. Some data (eg attendance figures)
    will come from a known source that you have
    control of and feel you understand and can rely
    on. Other data (eg standardised test results)
    come from a source you might not really
    understand - they may be subject to manipulation
    and predetermined criteria or processes (like
    standards or scaling). Some data (eg personality
    profiles) may be presented as if they are sourced
    in an objective way but their reliability might
    be variable.

116
Terminology
  • Demographics
  • Data relating to characteristics of groups
    within the school's population. Data that
    provide a profile of people at your school.
  • You will have the usual data relating to your
    students (gender, ethnicity, etc) and your staff
    (gender, ethnicity, years of experience, etc).
    Some schools collect other data, such as the
    residential distribution of students and
    parental occupations.

117
Terminology
  • Disaggregation
  • See aggregation
  • When you disaggregate data, you take aggregated
    data apart to see what you can discover from the
    component parts. For example, a student may do
    moderately well across a whole subject, but you
    need to disaggregate the year's result to see
    where her weaknesses lie.

118
Terminology
  • Evaluation
  • Any process of reviewing or making a judgement
    about a process or situation.
  • In this resource, evaluation is used in two
    different but related ways. After you have
    analysed data and taken action to change a
    situation, you will carry out an evaluation to
    see how successful you have been - this is
    summative evaluation. But you are also encouraged
    to evaluate at every step of the way - when you
    select data, when you decide on questions, when
    you consider the results of data analysis, when
    you decide what actions to take on the basis of
    the data - this is called formative evaluation.

119
Terminology
  • Evidence
  • Any facts, circumstances or perceptions that can
    be used as an input for an analysis or decision.
  • For example, the way classes are compiled, how a
    timetable is structured, how classes are
    allocated to teachers, student portfolios of
    work, student opinions. These are not data,
    because they are not coded as numbers, but they
    can be factors in shaping teaching and learning
    and should be taken into account whenever you
    analyse data and when you decide on action that
    could improve student achievement.

120
Terminology
  • Information
  • Knowledge gained from analysing data and making
    meaning from evidence.
  • Information is knowledge (or understanding) that
    can inform your decisions. How certain you will
    be about this knowledge depends on a number of
    factors: where your data came from, how reliable
    it was, how rigorous your analysis was. So the
    information you get from analysing data could be
    a conclusion, a trend, a possibility.

121
Terminology
  • Inter-subject analysis
  • A detailed examination of data and evidence
    gathered from more than one learning area.
  • Inter-subject analysis can answer questions or
    reveal trends about students or teaching
    practices that are common to more than one
    learning area. For example, analysing the results
    of students taking mathematics and physics
    subjects can indicate the extent to which
    achievements in physics are aided or impeded by
    the students' mathematical skills. (A minimal
    sketch follows.)
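
A minimal Python sketch of that example - flagging students whose physics results sit well below their mathematics results. The marks and the 15-point threshold are invented for illustration:

    # student -> (mathematics mark, physics mark) -- invented data
    results = {
        "Mere": (85, 62), "Nikau": (70, 68), "Olivia": (91, 88),
        "Pita": (66, 45), "Ruby": (58, 60),
    }

    GAP = 15    # an arbitrary gap worth investigating

    for student, (maths, physics) in results.items():
        if maths - physics >= GAP:
            print(f"{student}: maths {maths}, physics {physics} -"
                  " is something other than physics knowledge the barrier?")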

122
Terminology
  • Intervention
  • Any action that you take to change a situation,
    generally following an analysis of data and
    evidence.
  • This term is useful as it emphasises that to
    change students' achievement, you will have to
    change something about the situation that lies
    behind achievement or non-achievement. You will
    take action to interrupt the status quo.

123
Terminology
  • Intra-subject analysis
  • A detailed examination of data and other evidence
    gathered from within a specific learning area.
  • Intra-subject analysis can answer questions or
    reveal trends about student achievement or
    teaching within a subject or learning area. For
    example, an analysis of assessment results for
    all students studying a particular subject in a
    school can reveal areas of strength and weakness
    in student achievement and/or in teaching
    practices, etc. Comparison of a school's results
    in a subject with results in that subject in
    other schools is also intra-subject analysis.

124
Terminology
  • Longitudinal analysis
  • A detailed examination of data and evidence to
    reveal trends over time.
  • Longitudinal analysis in education is generally
    used to reveal patterns in student achievement,
    behaviour, etc over a number of years. Results
    can reveal the relative impact of different
    learning environments, for example. In this
    resource, it is suggested that longitudinal
    analysis can be applied to teaching practice and
    school processes. For example, the impact of
    modified teaching practices in a subject over a
    number of years can be evaluated by analysing the
    achievements of successive cohorts of students.
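
A minimal Python sketch of that example - tracking the pass rate of successive year 11 cohorts in one subject. All figures are invented for illustration:

    # year -> proportion of the cohort achieving the standard (invented)
    pass_rates = {2003: 0.61, 2004: 0.63, 2005: 0.70, 2006: 0.74}

    years = sorted(pass_rates)
    for prev, year in zip(years, years[1:]):
        change = pass_rates[year] - pass_rates[prev]
        print(f"{prev} -> {year}: {change:+.0%}")
    # A steady drift across cohorts supports (but does not prove) a
    # link to the modified teaching practice.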