Title: Evaluation Overview
Slide 1: Evaluation Overview & Questionnaire Design
Slide 2: Agenda
- Questions?
- Thursday: prototype implementation decision
- Thursday: sign up with Rod for a mini-prototype demo during the week of Nov 5-8
- Review: Evaluating Input and Output Devices
- Overview of Evaluation
- Questionnaire design
Slide 3: Project Part 3 - Making an evaluation plan
- What criteria are important?
- What resources are available?
- evaluators, prototype, subjects
- Required authenticity of prototype
Slide 4: Input and Output Devices
- Look at an example: Using Low-Cost Sensing to Support Nutritional Awareness
- DFAB Interaction Framework
- Input
- Output
- Alternative I/O devices and trade-offs
- Evaluate
Slide 5: Interaction framework
- [Diagram of the interaction framework; only the label "Observation (task language)" is preserved]
Slide 6: Selecting the Devices
- Look at users
- Consider Task
- Environment
- Use the Interaction Framework to assess ease of use and task coverage
- directness of task, actions, interpretation
Slide 8: Evaluation Overview
- Explain key evaluation concepts and terms.
- Describe the evaluation paradigms and techniques used in interaction design.
- Discuss the conceptual, practical and ethical issues that must be considered when planning evaluations.
- Introduce the DECIDE framework.
Slide 9: User studies
- User studies involve looking at how people behave in their natural environments, or in the laboratory, both with old technologies and with new ones.
Slide 10: Some Definitions
- Objective vs. Subjective: quantitative measure vs. opinion
- Quantitative vs. Qualitative: measurement vs. descriptions/anecdotes
- Laboratory vs. Field (Naturalistic): controlled environment vs. real-world context of use
Slide 11: Evaluation paradigm
- Any kind of evaluation is guided explicitly or implicitly by a set of beliefs, which are often underpinned by theory. These beliefs and the methods associated with them are known as an evaluation paradigm.
- Example: usability testing has a controlled environment for testing.
Slide 12: Four evaluation paradigms
- quick and dirty: informal feedback evaluation
- usability testing: measure users' performance, strongly controlled
- field studies: natural settings; outsider vs. insider
- predictive evaluation: experts apply heuristics; models predict performance
Slide 13: Quick and dirty
- A "quick and dirty" evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs and are liked.
- Quick and dirty evaluations can be done at any time in the design cycle.
- The emphasis is on fast input to the design process rather than carefully documented findings.
Slide 14: Usability testing
- Usability testing involves recording typical users' performance on typical tasks in controlled settings. Field observations may also be used.
- As the users perform these tasks they are watched and recorded on video, and their key presses are logged.
- This data is used to calculate performance times and identify errors, and helps explain why the users did what they did (a small sketch of this step follows below).
- User satisfaction questionnaires and interviews are used to elicit users' opinions.
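Since the slide above mentions turning logged key presses into performance times and error counts, here is a minimal sketch of that step in Python. The timestamped event list and its labels (task_start, error, and so on) are hypothetical illustration data, not the schema of any particular logging tool.

# Minimal sketch: deriving performance measures from one logged session.
# The events below are made-up illustration data.
events = [
    (0.0,  "task_start"),
    (3.2,  "keypress"),
    (5.8,  "error"),      # e.g. the wrong menu was opened
    (9.1,  "keypress"),
    (12.4, "task_end"),
]

start = next(t for t, e in events if e == "task_start")
end = next(t for t, e in events if e == "task_end")
errors = sum(1 for _, e in events if e == "error")

print(f"task time: {end - start:.1f} s, errors: {errors}")
# -> task time: 12.4 s, errors: 1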
Slide 15: Field studies
- Field studies are done in natural settings.
- The aim is to understand what users do naturally and how technology impacts them.
- In product design, field studies can be used to: identify opportunities for new technology; determine design requirements; decide how best to introduce new technology; evaluate technology in use.
Slide 16: Predictive evaluation
- Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems.
- Another approach involves theoretically based models (a small sketch follows below).
- A key feature of predictive evaluation is that users need not be present.
- Relatively quick and inexpensive.
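As a concrete illustration of the "theoretically based models" bullet, here is a minimal Keystroke-Level Model (KLM) sketch. The operator times are the commonly cited average values; the task sequence is invented for illustration, and a real prediction would use values calibrated to the actual interface and users.

# Minimal Keystroke-Level Model (KLM) sketch for predictive evaluation.
# Operator times (seconds) are commonly cited averages; treat them as
# rough assumptions rather than measurements of your own users.
OPERATOR_TIMES = {
    "K": 0.28,  # press a key or button (average typist)
    "P": 1.10,  # point at a target with the mouse
    "H": 0.40,  # move hands between keyboard and mouse
    "M": 1.35,  # mental preparation before an action
}

def klm_estimate(sequence):
    """Sum operator times for a sequence such as 'MHPKMKKKK'."""
    return sum(OPERATOR_TIMES[op] for op in sequence)

# Hypothetical task: think, reach for the mouse, point and click,
# think again, then type four characters. No users needed.
print(round(klm_estimate("MHPK" + "MKKKK"), 2))  # predicted seconds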
Slide 17: Evaluation Paradigm Characteristics
- Role of Users
- Who controls
- Location
- When used
- Type of Data
- Fed back into design by
- Philosophy
Slide 18: Overview of techniques
- observing users
- asking users their opinions
- asking experts their opinions
- testing users' performance
- modeling users' task performance
Slide 19: Relationship between Paradigms and Techniques
Slide 20: DECIDE - A framework to guide evaluation
- Determine the goals the evaluation addresses.
- Explore the specific questions to be answered.
- Choose the evaluation paradigm and techniques to answer the questions.
- Identify the practical issues.
- Decide how to deal with the ethical issues.
- Evaluate, interpret and present the data.
Slide 21: Determine the goals
- What are the high-level goals of the evaluation?
- Who wants it and why?
- The goals influence the paradigm for the study
- Some examples of goals
- Identify the best metaphor on which to base the design.
- Check to ensure that the final interface is consistent.
- Investigate how technology affects working practices.
- Improve the usability of an existing product.
Slide 22: Explore the questions
- All evaluations need goals and questions to guide them so time is not wasted on ill-defined studies.
- For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions: What are customers' attitudes to these new tickets? Are they concerned about security? Is the interface for obtaining them poor?
- What questions might you ask about the design of a cell phone?
Slide 23: Choose the evaluation paradigm and techniques
- The evaluation paradigm strongly influences the techniques used, and how data is analyzed and presented.
- E.g. field studies do not involve testing or modeling.
Slide 24: Identify practical issues
- For example, how to
- select users
- stay on budget
- stay on schedule
- find evaluators
- select equipment
Slide 25: Decide on ethical issues
- Develop an informed consent form
- Participants have a right to: know the goals of the study; know what will happen to the findings; privacy of personal information; not be quoted without their agreement; leave when they wish; and be treated politely.
Slide 26: Evaluate, interpret and present data
- The following also need to be considered:
- Reliability: can the study be replicated?
- Validity: is it measuring what you thought?
- Biases: is the process creating biases?
- Scope: can the findings be generalized?
- Ecological validity: is the environment of the study influencing it? (e.g. the Hawthorne effect)
- How data is analyzed and presented depends on the paradigm and techniques used.
Slide 27: Pilot studies
- A small trial run of the main study.
- The aim is to make sure your plan is viable.
- Pilot studies check: that you can conduct the procedure; and that interview scripts, questionnaires, experiments, etc. work appropriately.
- It's worth doing several to iron out problems before doing the main study.
- Ask colleagues if you can't spare real users.
Slide 28: Making an evaluation plan
- What criteria are important?
- What resources are available?
- evaluators, prototype, subjects
- Required authenticity of prototype
Slide 29: Evaluation techniques
- Questionnaire
- Interviews
- Think aloud (protocol analysis)
- Cognitive walkthrough
- Predictive modeling
- Heuristic evaluation
- Empirical user studies
Slide 30: Classifying Techniques
- How/when is it used?
- Formative
- Summative
- What data is obtained?
- Quantitative
- Qualitative
Slide 31: Questionnaire Design
- Summative or formative
- Quantitative or qualitative
- Usually an inexpensive way to get lots of information
Slide 32: Goals of a Questionnaire
- A good questionnaire requires design
- High-level goals
- What questions are you trying to answer?
- Who are you trying to get answers from?
Slide 33: Contents of a questionnaire
- General/Background info
- name, experience
- Objective
- Open-ended/subjective
Slide 34: Advice on survey design
- Take your own survey first
- Know what answers you are trying to elicit
- Too long, and you'll be sorry
Slide 35: Background examples
- What is your age?
- What is your major course of study?
- Have you ever worked at a restaurant?
Potential problems?
Slide 36: Objective questions
- Good for gathering quantitative trends
- When taking notes in class, what percentage of what the instructor writes do you write in your own notes?
Potential problems?
Slide 37: Form of response
- Questionnaire formats can include: yes/no checkboxes; checkboxes that offer many options; Likert rating scales; semantic scales; open-ended responses.
Slide 38: Possible improvement
- Multiple choice (choose one)
- How much of what the lecturer writes in class do you record in your notes?
- ___ More than what (s)he writes
- ___ Everything (s)he writes
- ___ Some of what (s)he writes
- ___ None of what (s)he writes
- ___ Other, please specify
Slide 39: A Likert version
- In taking notes in this class, I always write down everything the instructor writes down on the board.
- 1 = Strongly Agree, 2 = Agree, 3 = Neutral, 4 = Disagree, 5 = Strongly Disagree (a small sketch of summarising such responses follows below)
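To show how responses on a scale like this might be summarised quantitatively, here is a minimal sketch. The response values are invented illustration data, and the 1 = Strongly Agree ... 5 = Strongly Disagree coding simply mirrors the scale above.

# Minimal sketch: summarising Likert responses (1 = Strongly Agree ...
# 5 = Strongly Disagree). The responses are made-up illustration data.
from collections import Counter
from statistics import mean, median

responses = [1, 2, 2, 3, 4, 2, 5, 1, 3, 2]

print("counts:", dict(sorted(Counter(responses).items())))
print("median:", median(responses))        # Likert data is ordinal, so the median is the safer summary
print("mean:", round(mean(responses), 2))  # often reported as well, but interpret with care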
Slide 40: Subjective
- Good for exploring richer explanations
- When something important is presented in class, describe how you signify its occurrence in your notes.
Potential problems?
Improvements?
Slide 41: More advice
- see Web primers
- on-line questionnaire design resource
Slide 42: Clarity is Important
- Questions must be clear, succinct, and unambiguous.
- Example of a problem: "To be eligible, your mother and your father must both be living and you must maintain regular contact with both of them."
Slide 43: Avoid Question Bias
- Leading questions unnecessarily force certain answers.
- "Do you think parking on campus can be made easier?"
- What is your overall impression of
- 1. Superb
- 2. Excellent
- 3. Great
- 4. Not so Great
Slide 44: Be Aware of Connotations
- Do you agree with the NFL owners' decision to oppose the referees' pay request?
- Do you agree with the NFL owners' decision in regards to the referees' pay demand?
- Do you agree with the NFL owners' decision in regards to the referees' suggested pay?
Slide 45: Handle Personal Info Carefully
- Ask questions subjects would not mind answering honestly.
- What is your waist size?
- All men wear a 32!!!
- If subjects are uncomfortable, you will lose their trust.
- Ask only what you really need to know.
Slide 46: Avoid Hypotheticals
- Avoid gathering information on uninformed opinions.
- Subjects should not be asked to consider something they've never thought about (or don't know or understand).
- Would a device aimed to make cooking easier help you?
Slide 47: Thursday
- Prototype implementation plan due
- Interviews, Think Aloud and Cognitive Walkthroughs