Title: An evaluation framework
1. An evaluation framework
2. The aims
- Explain key evaluation concepts and terms.
- Describe the evaluation paradigms and techniques used in interaction design.
- Discuss the conceptual, practical, and ethical issues that must be considered when planning evaluations.
- Introduce the DECIDE framework.
3. Some definitions/contrasts
- Evaluation study design
- Methods/techniques for data gathering
- Methods/techniques for data analysis
- Research vs. Industry
- Quantitative vs. Qualitative
- Objective vs. Subjective
- Experimental vs. Field studies
4. Evaluation paradigm
- Any kind of evaluation is guided explicitly or implicitly by a set of beliefs, which are often underpinned by theory. These beliefs and the methods associated with them are known as an evaluation paradigm.
5. User studies
- User studies involve looking at how people behave
in their natural environments, or in the
laboratory, both with old technologies and with
new ones.
6. Four evaluation paradigms
- quick and dirty
- usability testing
- field studies
- predictive evaluation
See the tables on pages 344 and 347.
7. Quick and dirty
- "Quick and dirty" evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs and are liked.
- Quick and dirty evaluations can be done at any time.
- The emphasis is on fast input to the design
process rather than carefully documented
findings.
8. Usability testing
- Usability testing involves recording typical users' performance on typical tasks in controlled settings. Field observations may also be used.
- As the users perform these tasks they are watched and recorded on video, and their key presses are logged.
- This data is used to calculate performance times, identify errors, and help explain why the users did what they did (see the sketch after this list).
- User satisfaction questionnaires and interviews are used to elicit users' opinions.
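To make the idea of working from logged data concrete, here is a minimal sketch of computing completion times and error counts from a session log. The event format, field names, and values are assumptions made for this example, not taken from the text.

```python
from statistics import mean

# Hypothetical log entries: (participant, task, event, timestamp in seconds)
log = [
    ("P1", "book-ticket", "task_start", 0.0),
    ("P1", "book-ticket", "error", 41.2),
    ("P1", "book-ticket", "task_end", 95.5),
    ("P2", "book-ticket", "task_start", 0.0),
    ("P2", "book-ticket", "task_end", 61.0),
]

def performance_summary(log, task):
    """Compute completion time and error count per participant for one task."""
    starts, times, errors = {}, {}, {}
    for participant, t, event, ts in log:
        if t != task:
            continue
        if event == "task_start":
            starts[participant] = ts
        elif event == "task_end":
            times[participant] = ts - starts[participant]
        elif event == "error":
            errors[participant] = errors.get(participant, 0) + 1
    return times, errors

times, errors = performance_summary(log, "book-ticket")
print("Mean completion time:", mean(times.values()), "s")  # 78.25 s
print("Errors per participant:", errors)                   # {'P1': 1}
```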
9. Field studies
- Field studies are done in natural settings
- The aim is to understand what users do naturally and how technology impacts them.
- In product design, field studies can be used to:
  - identify opportunities for new technology
  - determine design requirements
  - decide how best to introduce new technology
  - evaluate technology in use
10. Predictive evaluation
- Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems.
- Another approach involves theoretically based models (see the sketch below).
- A key feature of predictive evaluation is that users need not be present.
- Relatively quick and inexpensive.
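One well-known family of theoretically based models is GOMS and its Keystroke-Level Model (KLM), which predicts expert, error-free task time by summing standard operator times. The sketch below is an illustration using commonly cited approximate operator values; the task sequence is invented for the example.

```python
# Approximate Keystroke-Level Model (KLM) operator times in seconds
# (commonly cited averages; treat them as rough estimates).
OPERATOR_TIMES = {
    "K": 0.2,   # press a key (average skilled typist)
    "P": 1.1,   # point with the mouse at a target on screen
    "B": 0.1,   # press or release a mouse button
    "H": 0.4,   # move hands between keyboard and mouse
    "M": 1.35,  # mental preparation before an action
}

def predict_task_time(operators):
    """Predict expert, error-free task time by summing KLM operator times."""
    return sum(OPERATOR_TIMES[op] for op in operators)

# Hypothetical sequence: think, point at a field, click, home to the keyboard,
# then type a four-character code.
sequence = ["M", "P", "B", "H", "K", "K", "K", "K"]
print(f"Predicted time: {predict_task_time(sequence):.2f} s")  # 3.75 s
```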
11. Overview of techniques
- observing users (chapter 12),
- asking users their opinions (chapter 13),
- asking experts their opinions (chapter 13),
- testing users' performance (chapter 14),
- modeling users' task performance (chapter 14)
12. DECIDE: a framework to guide evaluation
- Determine the goals the evaluation addresses.
- Explore the specific questions to be answered.
- Choose the evaluation paradigm and techniques to answer the questions.
- Identify the practical issues.
- Decide how to deal with the ethical issues.
- Evaluate, interpret and present the data.
13. Determine the goals
- What are the high-level goals of the evaluation?
- Who wants it and why?
- The goals influence the paradigm for the study
- Some examples of goals:
  - Identify the best metaphor on which to base the design.
  - Check to ensure that the final interface is consistent.
  - Investigate how technology affects working practices.
  - Improve the usability of an existing product.
14. Explore the questions
- All evaluations need goals and questions to guide them so that time is not wasted on ill-defined studies.
- For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions:
  - What are customers' attitudes to these new tickets?
  - Are they concerned about security?
  - Is the interface for obtaining them poor?
- What questions might you ask about the design of a cell phone?
15. Choose the evaluation paradigm and techniques
- The evaluation paradigm strongly influences the techniques used and how data is analyzed and presented.
- E.g., field studies do not involve testing or modeling.
16. Identify practical issues
- For example, how to:
- select users
- stay on budget
- stay on schedule
- find evaluators
- select equipment
17. Decide on ethical issues
- Develop an informed consent form
- Participants have a right to:
  - know the goals of the study
  - know what will happen to the findings
  - privacy of personal information
  - not be quoted without their agreement
  - leave when they wish
  - be treated politely
18. Evaluate, interpret, and present data
- How data is analyzed and presented depends on the paradigm and techniques used.
- The following also need to be considered:
  - Reliability: can the study be replicated?
  - Validity: is it measuring what you thought?
  - Biases: is the process creating biases?
  - Scope: can the findings be generalized?
  - Ecological validity: is the environment of the study influencing it? E.g., the Hawthorne effect.
19. Pilot studies
- A small trial run of the main study.
- The aim is to make sure your plan is viable.
- Pilot studies check:
  - that you can conduct the procedure
  - that interview scripts, questionnaires, experiments, etc. work appropriately
- It's worth doing several to iron out problems before doing the main study.
- Ask colleagues if you can't spare real users.
20. Key points
- An evaluation paradigm is an approach that is influenced by particular theories and philosophies.
- Five categories of techniques were identified: observing users, asking users, asking experts, user testing, and modeling users.
- The DECIDE framework has six parts:
  - Determine the overall goals
  - Explore the questions that satisfy the goals
  - Choose the paradigm and techniques
  - Identify the practical issues
  - Decide on the ethical issues
  - Evaluate ways to analyze and present data
- Do a pilot study.
21. Observing users
22. The aims
- Discuss the benefits and challenges of different types of observation.
- Describe how to observe as an onlooker, a participant, and an ethnographer.
- Discuss how to collect, analyze, and present observational data.
- Examine think-aloud, diary studies, and logging.
- Provide you with the means to do observation and critique observation studies.
23. What and when to observe
- Goals and questions determine the paradigms and techniques used.
- Observation is valuable at any time during design.
- Quick and dirty observations are used early in design.
- Observation can be done in the field (i.e., field studies) and in controlled environments (i.e., usability studies).
- Observers can be:
  - outsiders looking on
  - participants, i.e., participant observers
  - ethnographers
24. Frameworks to guide observation
- A simple framework:
  - The person. Who?
  - The place. Where?
  - The thing. What?
- The Goetz and LeCompte (1984) framework:
  - Who is present?
  - What is their role?
  - What is happening?
  - When does the activity occur?
  - Where is it happening?
  - Why is it happening?
  - How is the activity organized?
25. The Robson (1993) framework
- Space. What is the physical space like?
- Actors. Who is involved?
- Activities. What are they doing?
- Objects. What objects are present?
- Acts. What are individuals doing?
- Events. What kind of event is it?
- Goals. What do they want to accomplish?
- Feelings. What is the mood of the group and of
individuals?
26. You need to consider
- Goals and questions
- Which frameworks and techniques
- How to collect data
- Which equipment to use
- How to gain acceptance
- How to handle sensitive issues
- Whether and how to involve informants
- How to analyze the data
- Whether to triangulate
27. Observing as an outsider
- As in usability testing
- More objective than participant observation
- In the usability lab, equipment is in place
- Recording is continuous
- Analysis and observation are almost simultaneous
- Care needed to avoid drowning in data
- Analysis can be coarse or fine grained
- Video clips can be powerful for telling the story
28. Participant observation and ethnography
- Debate about differences
- Participant observation is a key component of ethnography
- Must get co-operation of people observed
- Informants are useful
- Data analysis is continuous
- Interpretivist technique
- Questions get refined as understanding grows
- Reports usually contain examples
29. Data collection techniques
- Notes and still camera
- Audio and still camera
- Video
- Tracking users: diaries and interaction logging (see the sketch below)
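A minimal sketch of interaction logging: each user event is timestamped and appended to a file for later analysis. The event names and the file format are assumptions made for this example.

```python
import json
import time

LOG_FILE = "interaction_log.jsonl"  # hypothetical log file, one JSON record per line

def log_event(participant, event, detail=""):
    """Append a timestamped interaction event to the log file."""
    record = {
        "time": time.time(),        # seconds since the epoch
        "participant": participant,
        "event": event,
        "detail": detail,
    }
    with open(LOG_FILE, "a") as f:
        f.write(json.dumps(record) + "\n")

# Example: a few hypothetical events logged during a session
log_event("P1", "key_press", "Enter")
log_event("P1", "menu_open", "File")
log_event("P1", "task_end", "book-ticket")
```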
30. Data analysis
- Qualitative data: interpreted and used to tell the story about what was observed.
- Qualitative data: categorized using techniques such as content analysis.
- Quantitative data: collected from interaction and video logs. Presented as values, tables, charts, and graphs, and treated statistically (see the sketch below).
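As a small illustration of both kinds of analysis, the sketch below counts coded categories, as in content analysis, and summarizes logged completion times statistically. The categories and numbers are invented for the example.

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical content-analysis codes assigned to observed episodes
codes = ["navigation", "data entry", "help seeking",
         "navigation", "data entry", "navigation"]

# Hypothetical task completion times (seconds) taken from interaction logs
completion_times = [95.5, 61.0, 78.2, 102.4, 66.9]

print("Category counts:", Counter(codes))
print(f"Completion time: mean {mean(completion_times):.1f} s, "
      f"sd {stdev(completion_times):.1f} s")
```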
31. Interpretive data analysis
- Look for key events that drive the group's activity
- Look for patterns of behavior
- Test data sources against each other, i.e., triangulate
- Report findings in a convincing and honest way
- Produce rich or thick descriptions
- Include quotes, pictures, and anecdotes
- Software tools can be useful, e.g., NUDIST, Ethnograph (see URL resource list for examples)
32. Looking for patterns
- Critical incident analysis
- Content analysis
- Discourse analysis
- Quantitative analysis - i.e., statistics
33. Key points
- Observe from outside or as a participant
- Analyzing video and data logs can be time-consuming.
- In participant observation, collections of comments, incidents, and artifacts are made.
- Ethnography is a philosophy with a set of techniques that include participant observation and interviews.
- Ethnographers immerse themselves in the culture that they study.