Title: Quality Appraisal of Qualitative Research
1 Quality Appraisal of Qualitative Research
2 Introduction of participants
- Name and surname
- Job title/responsibilities
- Department/section
- Length of time in post
- Brief review of disciplinary background, training
- Research experience, particularly qualitative research
- Types of qualitative research with which involved, current or past
- Involvement in any qualitative evaluation process
- What would you hope to learn from a qualitative study?
3 Paradigm and method: the relationship between philosophy and research practice
- What is the nature of reality?
- What kind of knowledge can we have about reality?
- How can we investigate reality?
- What is the picture that we paint of reality?
4 Key terms
- Ontology: basic assumptions about the nature of reality.
- Epistemology: basic assumptions about what we can know about reality, and about the relationship between knowledge and reality.
- Paradigm: overarching perspective concerning appropriate research practice, based on ontological and epistemological assumptions.
- Methodology: specifies how the researcher may go about practically studying whatever he/she believes can be known.
5 Ontology
- What is the nature of reality?
- Positivist paradigm (Realism): a stable, law-like reality out there
- Interpretivist paradigm: multiple, emergent, shifting reality
6 Epistemology
- What is knowledge?
- What is the relationship between knowledge and reality?
- Positivism: meaning exists in the world; knowledge reflects reality.
- Interpretivism: meaning exists in our interpretations; knowledge is interpretation.
7 (Diagram linking Ontology and Epistemology to Scientific paradigm, Methodology and Knowledge)
8 Paradigms in social science research
- Three basic paradigms: Positivism, Interpretivism and Constructionism
9 Positivism
- Independence
- Value-free
- Causality
- Hypothesis and Deduction
- Operationalization
- Reductionism
- Generalization
- Cross-sectional analysis
10 Methodology: The Positivist Paradigm
- "Positivist research involves precise empirical observations of individual behaviour in order to discover probabilistic causal laws that can be used to predict general patterns of human activity" (Neuman, 1997: 63)
- Objective, value-free discovery
11 Methodology: The Interpretive Paradigm
- "The study of social life involves skills that are more like the skills of literary or dramatic criticism and of poetics than the skills of physical scientists." (Rom Harre, quoted in Phillips, 1987, p. 105)
- Importance of the researcher's perspective and the interpretative nature of social reality.
12 Knowledge
- Positivism: accurate knowledge exactly reflects the world as it is.
- Interpretivism: knowledge provides suggestive interpretations by particular people at particular times.
13 Key characteristics of qualitative research (1)
- A concern with meanings, especially the subjective meanings of participants
- A concern with exploring phenomena from the perspectives of those being studied
- An awareness and consideration of the researcher's role and perspective (reflexivity)
- Ability to preserve and explore context (at the individual level and in the sense of understanding broader social and organizational contexts)
- Answering 'what is', 'how' and 'why' questions
- Use of unstructured methods which are sensitive to the social context of the study
- Naturalistic inquiry (study real-world situations as they unfold naturally; no manipulation or intervention)
- Prolonged immersion in, or contact with, the research setting
- The absence of methodological orthodoxy and the use of a flexible (emergent) research strategy
14 Key characteristics of qualitative research (2)
- Capture of data which are detailed, rich and complex (use of 'thick description')
- A mainly inductive rather than deductive analytic process
- Attention paid to emergent categories and theories rather than sole reliance on a priori concepts and ideas
- The collection and analysis of data that are mainly in the form of words (textual data) and images rather than numbers
- A commitment to retaining diversity and complexity in the analysis
- Development rather than testing of hypotheses
- Explanations offered at the level of meaning, or in terms of local causality (why certain interactions do or do not take place), rather than surface workings or context-free laws
- Holistic perspective (study the whole phenomenon)
- Employs a variety of methods, including exploratory interviews, focus groups, observation (participatory and non-participatory), conversation, discourse and narrative analysis, and documentary and video analysis.
16 Selection of research strategy
Strategy | Form of research question | Control over behavioural events | Focus on contemporary events
Experiment | how, why | Yes | Yes
Survey | who, what, where, how many, how much | No | Yes
Case study | how, why | No | Yes
History | how, why | No | No
Archival analysis | who, what, where, how many, how much | No | Yes/No
Source: Yin (2003)
17 Sampling
- Quantitative: statistical sampling (maximizing external validity or generalization)
- Qualitative: theoretical sampling (Glaser and Strauss, 1967) or purposive sampling (Lincoln and Guba, 1985), rather than conventional or statistical sampling
- In theoretical sampling, the relation between sampling and explanation is iterative and theoretically led
- The purpose of purposive sampling is to maximise information, not to facilitate generalisation
- Deliberate inclusion of a wide range of types of informants with access to important sources of knowledge
- The criterion used to determine when to stop purposive sampling is informational redundancy (data saturation), not a statistical confidence level; see the sketch below
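To make the informational-redundancy stopping rule concrete, here is a minimal, hypothetical Python sketch (not from the slides): sampling continues until a run of consecutive interviews contributes no new codes. The `code` sets, the `reached_saturation` helper and the two-interview window are illustrative assumptions, not a prescribed procedure.

```python
# Hypothetical sketch of a data-saturation stopping rule for purposive sampling.
# Sampling stops when consecutive interviews add no new codes (informational
# redundancy), not when a statistical confidence level is reached.

def reached_saturation(coded_interviews, window=2):
    """Return True if the last `window` interviews contributed no new codes."""
    seen = set()
    new_code_counts = []
    for codes in coded_interviews:          # codes: set of labels per interview
        fresh = set(codes) - seen
        new_code_counts.append(len(fresh))
        seen |= fresh
    if len(new_code_counts) < window:
        return False
    return all(n == 0 for n in new_code_counts[-window:])


# Illustrative data: codes assigned to four interviews, in collection order.
interviews = [
    {"workload", "autonomy"},
    {"workload", "peer support"},
    {"autonomy"},
    {"peer support"},
]
print(reached_saturation(interviews))  # True: interviews 3 and 4 added nothing new
```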
18 Data collection
- All qualitative data collection methods involve collecting data in the form of words, talk, experience and actions (some degree of interaction between researcher and participants, with the exception of document analysis)
- Interviewing (from unstructured to totally structured)
- Focus group (ideally 6 to 8 people)
- Observation (participant observation and non-participant observation)
- Unstructured diary-keeping and journals (where these have been written specifically for a research project)
- Analysis of existing documents or audio-visual media (contemporary or historical sources)
- Discourse analysis
- Conversation analysis
- Biographical methods such as life histories
19 Sources of evidence and their strengths and weaknesses
Source of evidence | Strengths | Weaknesses
Documentation | stable (repeated review); unobtrusive (exists prior to study); exact (names etc.); broad coverage (extended time span) | retrievability can be difficult; biased selectivity; reporting bias (reflects author bias); access may be blocked
Archival records | same as above; precise and quantitative | same as above; privacy might inhibit access
Interviews | targeted; insightful (provides perceived causal inferences) | bias due to poor questions; response bias; incomplete recollection; reflexivity (interviewee expresses what interviewer wants to hear)
Direct observation | reality (covers events in real time); contextual (covers event context) | time-consuming; selectivity (might miss facts); reflexivity (observer's presence might cause change); cost (observers need time)
Participant observation | same as above; insightful into interpersonal behaviour | same as above; bias due to investigator's actions
Physical artefacts | insightful into cultural features; insightful into technical operations | selectivity; availability
Source: Yin (2003), p. 80
20 Data Collection Methods
- Interview
- Unstructured: area of interest may be specified, but all else occurs impromptu
- Partially structured: area is chosen and questions are formulated a priori, but the interviewer decides on the order as the interview occurs
- Semi-structured: area, questions and order are predetermined; questions are open-ended and the interviewer records the essence of each response
- Structured: area, questions and order are predetermined; questions are open-ended and responses are coded by the interviewer as given
- Totally structured: area, questions and order are predetermined; the respondent is provided with alternatives for each question (i.e. multiple choice)
21 Data Collection Methods
- Focus group (4 to 12 participants)
- Capitalizes on communication between research participants to generate data
- Highlights the respondents' attitudes, priorities, language and framework of understanding
22 Data Collection Methods
- Observation
- Non-participant: unobtrusive (to the greatest extent possible); researcher not engaged in the activities of the group/situation under study
- Participant: researcher is engaged in the activities of the group/situation under study
- Participant-as-observer, observer-as-participant: researcher has a primary role, but moonlights as the other
23 Data Collection Methods
- Historical/archival
- Uses existing records
- Written documents
- Video recordings or film
- Audio recordings
- Combination
24 Data analysis
- Several different strategies for analysis
- Aims at an explanation for a particular phenomenon, experience or institution, rather than a mere description of a range of observations, responses or narrative accounts of subjective experience
- Explores concepts, establishes linkages between the concepts implied in the research question and the data set, and provides explanations for patterns, or for ranges of reasons or observations, from different sources
25 Data analysis
- Starts from the data collection phase (interim analysis or sequential analysis)
- Content analysis (often used in media and mass communications; counts items)
- Inductive (categories derive gradually from the data) or deductive (categories defined at the beginning, or part way through the analysis, as a way of approaching the data)
- Grounded theory: developing hypotheses from the ground, or research field, upwards rather than defining them a priori
- Grounded theory: the inductive process of coding incidents in the data and identifying analytical categories as they emerge from the data
- Deductive forms are increasingly being used in applied qualitative analysis (e.g. the framework approach uses both deductive and inductive approaches); a minimal coding sketch follows below
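As an illustration of the deductive route, the following minimal Python sketch (not part of the slides; the coding frame and transcript segments are invented) applies an a priori coding frame to transcript segments and counts how often each category occurs, which is the basic move of deductive content analysis.

```python
# Hypothetical sketch: deductive content analysis with an a priori coding frame.
# Each category is defined up front by indicator terms; the analysis counts how
# many transcript segments fall under each category.

from collections import Counter

# A priori coding frame (invented for illustration).
CODING_FRAME = {
    "workload": ["busy", "overtime", "pressure"],
    "support": ["colleague", "mentor", "help"],
    "training": ["course", "induction", "skills"],
}

def code_segment(segment: str) -> list[str]:
    """Return every category whose indicator terms appear in the segment."""
    text = segment.lower()
    return [cat for cat, terms in CODING_FRAME.items()
            if any(term in text for term in terms)]

def content_analysis(segments: list[str]) -> Counter:
    """Count category occurrences across all transcript segments."""
    counts = Counter()
    for segment in segments:
        counts.update(code_segment(segment))
    return counts

# Invented transcript segments.
segments = [
    "The overtime and constant pressure wore me down.",
    "A mentor helped me settle in during the induction course.",
    "Colleagues help when things get busy.",
]
print(content_analysis(segments))
# Counter({'workload': 2, 'support': 2, 'training': 1})
```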
26 Framework Analysis
- 1) Familiarization
- 2) Identifying the thematic framework
- 3) Indexing
- 4) Charting (see the sketch below)
- 5) Mapping and interpretation
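To show what the charting stage produces, here is a small, hypothetical Python sketch (cases, themes and summarized extracts are invented): indexed data are rearranged into a case-by-theme matrix, the chart that mapping and interpretation then work from.

```python
# Hypothetical sketch of framework-analysis charting: indexed extracts are
# rearranged into a case-by-theme matrix for mapping and interpretation.

# Indexed data: (case, theme, summarized extract), all invented for illustration.
indexed_extracts = [
    ("Nurse A", "workload", "describes routine unpaid overtime"),
    ("Nurse A", "support", "relies on informal peer support"),
    ("Nurse B", "workload", "feels workload is manageable"),
    ("Nurse B", "training", "wants more structured induction"),
]

def build_chart(extracts):
    """Return a nested dict: chart[case][theme] -> list of summarized extracts."""
    chart = {}
    for case, theme, summary in extracts:
        chart.setdefault(case, {}).setdefault(theme, []).append(summary)
    return chart

chart = build_chart(indexed_extracts)
themes = sorted({theme for _, theme, _ in indexed_extracts})

# Print the chart with one row per case and one cell per theme.
for case, row in chart.items():
    cells = [f"{theme}: {'; '.join(row.get(theme, ['-']))}" for theme in themes]
    print(case, "|", " | ".join(cells))
```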
27 Computer Assisted Qualitative Data Analysis (CAQDAS)
- Software packages to facilitate the management, processing and analysis of qualitative data. Examples include:
- 1) The Ethnograph
- 2) ATLAS.ti
- 3) NUD*IST
- 4) QSR
- 5) NVivo
- None of the software packages is able to do the analysis; the researcher is still responsible for developing a coding scheme, interpreting all the data and formulating conclusions! (A minimal sketch of this division of labour follows below.)
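The division of labour described above can be shown with a tiny, hypothetical Python sketch (the codes and segments are invented): the "software" part merely stores researcher-assigned codes and retrieves the segments filed under a code, while the interpretive decision about what counts as, say, "reflexivity" stays with the researcher.

```python
# Hypothetical sketch of what CAQDAS-style tools automate: storing codes that the
# researcher assigns to text segments and retrieving segments by code. The
# interpretation behind each coding decision remains the researcher's work.

from collections import defaultdict

class CodedCorpus:
    def __init__(self):
        self._index = defaultdict(list)  # code -> list of text segments

    def code(self, segment: str, *codes: str) -> None:
        """File a segment under one or more researcher-chosen codes."""
        for c in codes:
            self._index[c].append(segment)

    def retrieve(self, code: str) -> list[str]:
        """Return every segment the researcher filed under this code."""
        return list(self._index.get(code, []))

corpus = CodedCorpus()
corpus.code("I kept wondering how my own nursing background shaped the questions.",
            "reflexivity")
corpus.code("We never had time to debrief after difficult shifts.",
            "workload", "support")

print(corpus.retrieve("reflexivity"))
```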
28 Reporting
- Clear links between original data, interpretation and conclusions
- Clear and coherent
- Selection and presentation of appropriate and adequate data
- Presenting the emergence of themes and concepts
29 Reflexivity
- Conducting qualitative research exposes the personal influence of the researcher far more than quantitative methods, as the researcher is central to data collection, analysis and interpretation. Within the qualitative research paradigm, a high degree of reflexivity on the part of the researcher is required throughout the research process.
- Researchers need to take into account the way that their own background and social position, a priori knowledge and assumptions affect all aspects of research development and design, data collection, analysis and interpretation (Jaye, 2002).
30 Reflexivity
- Mays and Pope (2000) relate the concept of reflexivity to sensitivity to the way in which the researcher and the research process have both shaped the data. Through personal accounting, researchers become aware of their own position (e.g. gender, race, class, and power within the research process) and of how these factors necessarily shape all stages of data collection and analysis (Hertz, 1997).
31 How Do We Evaluate Outputs of Qualitative Research?
- Conceptual themes
- Contributory
- Defensible in design
- Rigorous in conduct
- Credible in claim
Spencer, L., Ritchie, J., Lewis, J. and Dillon, L. (2003). Quality in Qualitative Evaluation: A framework for assessing research evidence. Government Chief Social Researcher's Office, Cabinet Office, United Kingdom.
32 Identification of some underlying central concerns and principles
- Defensibility of design: by providing a research strategy that can address the evaluative questions posed
- Rigour of conduct: through the systematic and transparent collection, analysis and interpretation of qualitative data
- Credibility of claims: through offering well-founded and plausible arguments about the significance of the evidence generated
- Contribution to knowledge and understanding (e.g. about theory, policy, practice, or a particular substantive field)
33 Lincoln and Guba's naturalistic criteria (Trustworthiness)
Aspect | Scientific term (quantitative) | Naturalistic term (qualitative)
Truth value | Internal validity | Credibility
Applicability | External validity or generalisability | Transferability
Consistency | Reliability | Dependability
Neutrality | Objectivity | Confirmability
34 Triangulation
- Triangulation is a strategy which can be used to corroborate the validity of research findings. Its types, as defined by Denzin (1984), are:
- 1) Data source triangulation: collection of data from various relevant groups and stakeholders with an interest in the phenomenon under study
- 2) Investigator triangulation: the use of several researchers to study the same phenomenon using the same method
- 3) Theory triangulation: the strategy used when different investigators with different perspectives interpret the same data/results (multidisciplinary team)
- 4) Methodological triangulation: utilization of various methodologies to examine a particular phenomenon
35 Validity and reliability issues
Criterion | Techniques used to meet the criterion (phase of study)
Construct validity | Data triangulation (data collection); maintaining a chain of evidence (data collection/ordering); having key informants review the draft case study report (composition); seminar presentation (analysis/composition)
Internal validity | Explanation building (design/analysis); peer debriefing (analysis/debriefing); pre-publishing (composition); pattern matching (data analysis)
External validity | Relating to extant literature (design/analysis)
Reliability | Case study protocol (data collection); establishing a case study database (all phases); keeping a research diary (all phases)
Source: Yin (2003)
36 Quality standards in qualitative research
- Widespread concerns about quality
- Rigour
- Robustness
- Relevance
- Utility of research
37 Addressing the 'holy trinity'
- There is no escape from the 'holy trinity' (validity, reliability and objectivity)
- Identified underlying themes:
- internal validity (procedural/methodological; interpretive: accuracy or credibility of findings; relational: outcomes in relation to participants)
- external validity (relevance; generalisability; auditability; contextual detail)
- reliability (replication; consistency; auditability)
- objectivity (neutral/value-free; auditability; reflexivity)
- soundness/well-foundedness vs. goodness/worthwhileness
38 The whole idea of qualitative standards or criteria
- Many different positions:
- rejection of criteria for philosophical or methodological reasons
- proposal of alternative criteria (unrelated to notions of rigour or credibility)
- proposal of parallel criteria (addressing notions of rigour or credibility)
- adoption of traditional scientific criteria (to be applied rather differently)
39 The idea of criteria (contd.)
- Concern about rigid checklists
- Concern about a 'tick box' mentality
- Avoided the term 'criteria'
- Adopted a series of flexible, open-ended questions around guiding principles and quality issues
- Retained the centrality of experience and judgement, not mechanistic rule-following
- Qualitative research should be assessed on its own terms, within premises that are central to its purpose, nature and conduct
40 The debate
- Against universal criteria:
- Different philosophical assumptions of qualitative methods
- The diversity of qualitative methods makes universal criteria irrelevant
- Qualitative studies are not feasible for systematic reviews
- In favour of universal criteria:
- The research question dictates the design
- All findings should emerge from the participants' experiences (credibility)
- Urgent need to develop a consensus around what would constitute a 'good enough' appraisal tool for qualitative and/or multi-method studies
41 Developing consensus?
- Over 100 quality appraisal forms exist to evaluate qualitative research
- Discrepancies in how these tools attempt to appraise the quality of qualitative research
- Many do not distinguish between different study designs, theoretical approaches, and standards for rigour, credibility and relevance
- The majority of these appraisal tools have not themselves been systematically tested
42 Why develop frameworks?
- Growing emphasis on ways of formalising quality standards
- Appraising the existing research literature
- Growing use of systematic review
- No explicitly agreed standards regarding what constitutes quality in qualitative policy evaluation methods
- No agreed formal criteria for judging the quality of qualitative evaluation research
43 Why develop frameworks?
- To produce a set of criteria that researchers and policy makers can use to assess the extent to which a particular study demonstrates attention to key quality issues
- To provide guidance on how standards can be used in appraising individual studies
- For the use of commissioners and managers of research; funders of research; government-based policy makers who use qualitative research; experts and academics; and researchers conducting qualitative research
44 Critical Appraisal Skills Programme (CASP)
- 1) Was there a clear statement of the aims of the research?
- Consider
- what the goal of the research was
- why it is important
- its relevance
45 Critical Appraisal Skills Programme (CASP)
- 2) Is a qualitative methodology appropriate?
- Consider
- if the research seeks to interpret or
illuminate the actions and/or subjective
experiences of research participants
46 CASP-Appropriate research design
- 3) Was the research design appropriate to address the aims of the research?
- Consider
- if the researcher has justified the research
design (e.g. have they discussed how they decided
which methods to use?)
47 CASP-Sampling
- 4) Was the recruitment strategy appropriate to the aims of the research?
- Consider
- if the researcher has explained how the participants were selected
- if they explained why the participants they selected were the most appropriate to provide access to the type of knowledge sought by the study
- if there are any discussions around recruitment (e.g. why some people chose not to take part)
48 CASP-Data collection (1)
- 5) Were the data collected in a way that addressed the research issue?
- Consider
- if the setting for data collection was justified
- if it is clear how data were collected (e.g. focus group, semi-structured interview, etc.)
- if the researcher has justified the methods chosen
49 CASP-Data collection (2)
- Consider
- if the researcher has made the methods explicit (e.g. for the interview method, is there an indication of how interviews were conducted, and did they use a topic guide?)
- if methods were modified during the study and, if so, whether the researcher explained how and why
- if the form of data is clear (e.g. tape recordings, video material, notes, etc.)
- if the researcher has discussed saturation of data
50 CASP-Reflexivity
- 6) Has the relationship between researcher and participants been adequately considered?
- Consider whether it is clear
- if the researcher critically examined their own role, potential bias and influence during:
- formulation of the research questions
- data collection, including sample recruitment and choice of location
- how the researcher responded to events during the study and whether they considered the implications of any changes in the research design
51 CASP-Ethical Issues
- 7) Have ethical issues been taken into consideration?
- Consider
- if there are sufficient details of how the research was explained to participants for the reader to assess whether ethical standards were maintained
- if the researcher has discussed issues raised by the study (e.g. issues around informed consent or confidentiality, or how they have handled the effects of the study on the participants during and after the study)
- if approval has been sought from the ethics committee
52 CASP-Data Analysis (1)
- 8) Was the data analysis sufficiently rigorous?
- Consider
- if there is an in-depth description of the analysis process
- if thematic analysis is used and, if so, whether it is clear how the categories/themes were derived from the data
- whether the researcher explains how the data presented were selected from the original sample to demonstrate the analysis process
53 CASP-Data Analysis (2)
- Consider
- if sufficient data are presented to support the findings
- to what extent contradictory data are taken into account
- whether the researcher critically examined their own role, potential bias and influence during analysis and selection of data for presentation
54 CASP-Findings
- 9) Is there a clear statement of findings?
- Consider
- if the findings are explicit
- if there is adequate discussion of the evidence both for and against the researcher's arguments
- if the researcher has discussed the credibility of their findings (e.g. triangulation, respondent validation, more than one analyst)
- if the findings are discussed in relation to the original research questions
55 CASP-Value of the research
- 10) How valuable is the research?
- Consider
- if the researcher discusses the contribution the study makes to existing knowledge or understanding (e.g. do they consider the findings in relation to current practice or policy, or to relevant research-based literature?)
- if they identify new areas where research is necessary
- if the researchers have discussed whether or how the findings can be transferred to other populations, or considered other ways the research may be used
56 Quality Appraisal of Qualitative Studies
- 1 Question
- Did the paper address a clear research question and, if so, what was it?
- 2 Design
- What was the study design, and was this appropriate to the research question?
- In particular, was a qualitative approach suitable and was the right design used?
- 3 Context
- What was the context of the study?
- Was the context of the study sufficiently well described that the findings can be related to other settings?
- 4 Sampling
- Did the study include sufficient cases/settings/observations so that conceptual rather than statistical generalisations could be made?
- 5 Data collection
- Was the data collection process systematic, thorough, auditable and appropriate to the research question?
- Were attempts made to identify and explore disconfirming cases?
- 6 Data analysis
- Were data analysed systematically and rigorously?
- Did the analysis take account of all observations?
- Were sufficient data given to make the relationship between evidence and interpretation evident?
- How were disconfirming observations dealt with?
57 The structure of the framework
- Designed with a particular focus on the methods used most extensively (interviews, focus groups, observation and documentary analysis); however, it has application for a wider range of qualitative methods (e.g. linguistic analysis, historical and archival analysis, multimedia methods, etc.), for which complementary criteria should be added.
- Three tiers:
- 4 central principles
- 18 appraisal questions (indicative, discretionary, and avoiding yes/no answers; no scoring)
- a series of quality indicators (illustrative rather than exhaustive or prescriptive; no scoring)
58 The outline of the Framework
- Assessing outputs
- Covering all the main stages and processes involved in qualitative inquiry, but with heavy emphasis on analysis and findings
59 The framework: Appraisal questions
- Coverage of questions
- Design (1)
- Sample (2)
- Data collection (1)
- Data analysis (4)
- Findings (5)
- Reporting (2)
- Reflexivity and neutrality (1)
- Ethics (1)
- Auditability (1)
60 QF-Findings (1-5)
- 1) How credible are the findings?
- 2) How has knowledge/understanding been extended by the research?
- 3) How well does the evaluation address its original aims and purpose?
- 4) Scope for drawing wider inference: how well is this explained?
- 5) How clear is the basis of evaluative appraisal?
61 QF-Design (6)
- 6) How defensible is the research design?
62 QF-Sample (7-8)
- 7) How well defended is the sample design/target selection of cases/documents?
- 8) Sample composition/case inclusion: how well is the eventual coverage described?
63 QF-Data Collection (9)
- 9) How well was the data collection carried out?
64 QF-Analysis (10-13)
- 10) How well has the approach to, and formulation of, the analysis been conveyed?
- 11) Contexts of the data sources: how well are they retained and portrayed?
- 12) How well has diversity of perspective and content been explored?
- 13) How well has detail, depth and complexity (i.e. richness) of the data been conveyed?
65 QF-Reporting (14-15)
- 14) How clear are the links between data, interpretation and conclusions (i.e. how well can the route to any conclusions be seen)?
- 15) How clear and coherent is the reporting?
66 QF-Reflexivity and Neutrality (16)
- 16) How clear are the assumptions/theoretical perspectives/values that have shaped the form and output of the evaluation?
67 QF-Ethics (17)
- 17) What evidence is there of attention to
ethical issues?
68 QF-Auditability (18)
- 18) How adequately has the research process been
documented?